Apr 23 08:48:20.164548 ip-10-0-136-146 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 08:48:20.164617 ip-10-0-136-146 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 08:48:20.164627 ip-10-0-136-146 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 08:48:20.165006 ip-10-0-136-146 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 08:48:30.325492 ip-10-0-136-146 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 08:48:30.325512 ip-10-0-136-146 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b19596e33d8242d4a5941ed62f235a2b --
Apr 23 08:50:30.477077 ip-10-0-136-146 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:50:30.952448 ip-10-0-136-146 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:30.952448 ip-10-0-136-146 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:50:30.952448 ip-10-0-136-146 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:30.952448 ip-10-0-136-146 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:50:30.952448 ip-10-0-136-146 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:30.955340 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.955251 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:50:30.959591 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959567 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:30.959591 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959586 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:30.959591 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959590 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:30.959591 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959593 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:30.959591 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959598 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
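The deprecation warnings above say that flags such as --container-runtime-endpoint, --volume-plugin-dir, and --system-reserved should move into the file passed to the kubelet's --config flag. As a hypothetical illustration only (field names from the kubelet.config.k8s.io/v1beta1 KubeletConfiguration API; the endpoint value mirrors the crio.sock path logged below, while the volumePluginDir path and systemReserved sizes are invented examples, not values from this node), such a config fragment might look like:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (path taken from this node's flag dump)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (hypothetical path)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example values)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
```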
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959603 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959606 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959610 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959613 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959616 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959619 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959622 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959625 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959628 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959631 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959635 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959637 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959640 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959643 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959645 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959648 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959651 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959656 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959659 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:30.959809 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959662 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959665 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959667 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959670 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959672 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959675 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959677 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959681 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959684 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959686 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959689 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959691 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959694 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959696 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959699 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959703 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959723 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959725 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959728 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959731 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:30.960293 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959734 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959736 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959739 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959742 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959744 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959748 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959752 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959755 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959757 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959760 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959762 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959765 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959768 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959770 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959773 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959775 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959778 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959781 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959783 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959786 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:30.960810 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959789 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959792 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959795 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959798 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959800 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959803 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959807 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959810 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959813 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959816 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959819 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959822 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959824 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959827 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959831 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959834 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959836 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959839 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959842 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:30.961300 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959844 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959847 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.959850 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960255 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960260 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960262 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960265 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960268 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960270 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960273 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960277 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960280 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960283 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960286 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960289 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960291 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960294 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960297 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960300 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960303 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:30.961824 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960306 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960309 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960311 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960314 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960317 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960319 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960322 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960325 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960328 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960331 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960333 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960336 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960338 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960341 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960344 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960346 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960349 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960351 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960354 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960357 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:30.962315 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960360 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960362 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960365 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960368 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960370 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960373 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960376 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960378 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960380 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960383 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960385 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960388 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960391 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960393 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960396 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960399 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960401 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960405 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960407 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960410 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:30.962832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960413 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960416 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960418 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960421 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960423 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960427 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960431 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960434 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960437 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960440 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960442 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960445 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960447 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960450 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960453 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960455 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960459 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960461 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960464 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:30.963347 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960467 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960469 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960472 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960475 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960478 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960480 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960483 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960485 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960487 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.960490 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961773 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961784 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961790 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961795 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961802 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961805 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961810 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961819 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961822 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961826 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961830 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:50:30.963852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961833 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961837 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961840 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961843 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961846 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961849 2574 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961852 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961855 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961861 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961864 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961867 2574 flags.go:64] FLAG: --config-dir=""
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961870 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961873 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961877 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961881 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961884 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961888 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961891 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961894 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961896 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961899 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961903 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961908 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961912 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961914 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:50:30.964371 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961917 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961921 2574 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961924 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961929 2574 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961932 2574 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961936 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961939 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961942 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961946 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961949 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961952 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961955 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961958 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961961 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961964 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961966 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961969 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961972 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961975 2574 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961979
2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961982 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961986 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961989 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961992 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961996 2574 flags.go:64] FLAG: --help="false" Apr 23 08:50:30.964984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.961998 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-136-146.ec2.internal" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962002 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962004 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962007 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962011 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962014 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962017 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962020 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:50:30.965597 
ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962023 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962027 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962030 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962033 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962036 2574 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962039 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962042 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962045 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962048 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962051 2574 flags.go:64] FLAG: --lock-file="" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962054 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962057 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962061 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962066 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962069 2574 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962072 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:50:30.965597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962075 2574 flags.go:64] FLAG: --logging-format="text" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962078 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962081 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962084 2574 flags.go:64] FLAG: --manifest-url="" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962087 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962091 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962094 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962099 2574 flags.go:64] FLAG: --max-pods="110" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962102 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962105 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962107 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962110 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962114 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:50:30.966202 
ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962117 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962121 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962129 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962132 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962135 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962139 2574 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962142 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962147 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962150 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962153 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962156 2574 flags.go:64] FLAG: --port="10250" Apr 23 08:50:30.966202 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962159 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962162 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f8b47efa13623d7a" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962165 2574 flags.go:64] FLAG: --qos-reserved="" Apr 
23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962168 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962171 2574 flags.go:64] FLAG: --register-node="true" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962174 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962177 2574 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962181 2574 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962184 2574 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962186 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962189 2574 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962193 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962196 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962199 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962202 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962205 2574 flags.go:64] FLAG: --runonce="false" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962208 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962211 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962214 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962217 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962220 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962223 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962226 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962230 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962233 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962236 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:50:30.966865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962238 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962242 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962245 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962248 2574 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962251 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962256 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:50:30.967537 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:50:30.962259 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962262 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962267 2574 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962269 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962272 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962280 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962283 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962286 2574 flags.go:64] FLAG: --v="2" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962290 2574 flags.go:64] FLAG: --version="false" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962294 2574 flags.go:64] FLAG: --vmodule="" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962299 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.962302 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962408 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962412 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962416 2574 feature_gate.go:328] 
unrecognized feature gate: AlibabaPlatform Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962418 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962421 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962424 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:50:30.967537 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962427 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962430 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962434 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962437 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962440 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962442 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962445 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962447 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962450 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 
08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962452 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962456 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962458 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962462 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962465 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962467 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962470 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962472 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962475 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962477 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962482 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:50:30.968134 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962484 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962487 2574 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962491 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962494 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962497 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962500 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962503 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962507 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962510 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962512 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962515 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962517 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962520 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962522 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962527 2574 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962530 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962532 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962535 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962537 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962540 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:50:30.968650 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962542 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962545 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962548 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962551 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962553 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962556 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962558 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 
08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962561 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962565 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962568 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962571 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962575 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962577 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962580 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962583 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962585 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962588 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962590 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962593 2574 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 23 08:50:30.969183 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962595 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962598 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962600 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962603 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962605 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962608 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962611 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962617 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962620 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962623 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962625 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962627 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 
08:50:30.962630 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962633 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962635 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962638 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962640 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962644 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962646 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962649 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:30.969657 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.962651 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:30.970204 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.963681 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:30.970334 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.970315 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:50:30.970368 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.970335 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:50:30.970396 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970383 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:30.970396 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970389 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:30.970396 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970393 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:30.970396 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970396 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970399 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970402 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970405 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970408 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970411 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970413 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970416 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970419 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970422 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970425 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970427 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970430 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970432 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970436 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970438 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970443 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970449 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:30.970498 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970453 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970457 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970460 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970463 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970465 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970468 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970471 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970473 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970476 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970478 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970482 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970485 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970487 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970490 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970492 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970496 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970500 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970503 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970506 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970509 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970512 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:30.971057 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970514 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970517 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970520 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970523 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970526 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970529 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970532 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970535 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970538 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970541 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970543 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970546 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970549 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970552 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970554 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970557 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970559 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970562 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970565 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970567 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:30.971594 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970570 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970573 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970577 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970579 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970582 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970585 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970588 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970591 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970594 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970597 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970599 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970602 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970605 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970608 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970610 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970613 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970615 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970618 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970621 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970623 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:30.972120 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970626 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970629 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970631 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970634 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.970639 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970770 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970776 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970779 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970782 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970785 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970788 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970791 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970793 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970797 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970802 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:30.972603 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970805 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970808 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970810 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970813 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970816 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970819 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970822 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970825 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970828 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970830 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970833 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970836 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970838 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970841 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970843 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970846 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970848 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970851 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970854 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970856 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:30.972985 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970859 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970861 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970864 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970866 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970869 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970872 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970874 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970877 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970880 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970883 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970885 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970888 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970890 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970894 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970896 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970899 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970902 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970904 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970907 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970909 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:30.973471 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970912 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970915 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970917 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970920 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970922 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970925 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970928 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970930 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970933 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970935 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970938 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970941 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970944 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970947 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970949 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970952 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970954 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970956 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970959 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970962 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:30.974023 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970964 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970968 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970971 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970974 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970977 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970979 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970982 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970986 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970988 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970991 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970993 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970996 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.970998 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.971000 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.971003 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:30.974508 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:30.971006 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:30.974890 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.971012 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:30.974890 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.971824 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:50:30.979271 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.979256 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:50:30.980256 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.980245 2574 server.go:1019] "Starting client certificate rotation"
Apr 23 08:50:30.980361 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.980344 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:50:30.980391 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:30.980382 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:50:31.008822 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.008798 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:50:31.013171 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.013145 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:50:31.027564 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.027543 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:50:31.033466 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.033446 2574 log.go:25] "Validated CRI v1 image API"
Apr 23 08:50:31.034519 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.034503 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:50:31.040873 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.040857 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:50:31.041785 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.041766 2574 fs.go:135] Filesystem UUIDs: map[18d06db7-67a8-4d23-89c3-cbdabb6a9dcb:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 efe283dd-b3ab-4345-846f-77f723279931:/dev/nvme0n1p4]
Apr 23 08:50:31.041829 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.041787 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:50:31.047800 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.047644 2574 manager.go:217] Machine: {Timestamp:2026-04-23 08:50:31.045524654 +0000 UTC m=+0.438427696 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3127753 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2039cfc4f42df75e1299ccd55566ee SystemUUID:ec2039cf-c4f4-2df7-5e12-99ccd55566ee BootID:b19596e3-3d82-42d4-a594-1ed62f235a2b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:b8:34:ba:29 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:b8:34:ba:29 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:5f:f9:3d:70:31 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:50:31.048535 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.048525 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:50:31.048662 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.048650 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:50:31.049775 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.049749 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:50:31.049921 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.049777 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-146.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:50:31.049970 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.049930 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:50:31.049970 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.049939 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:50:31.049970 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.049956 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:50:31.050752 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.050741 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:50:31.052020 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.052010 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:50:31.052122 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.052113 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:50:31.054846 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.054837 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:50:31.054880 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.054857 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:50:31.054880 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.054870 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:50:31.054880 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.054881 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:50:31.055000 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.054897 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:50:31.056069 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.056057 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:50:31.056117 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.056075 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:50:31.059158 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.059128 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:50:31.060370 ip-10-0-136-146
kubenswrapper[2574]: I0423 08:50:31.060356 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:50:31.062059 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062046 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062064 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062070 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062076 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062082 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062087 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062093 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062099 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062106 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062113 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062125 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 
08:50:31.062135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.062138 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 08:50:31.063406 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.063376 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 08:50:31.063532 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.063521 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 08:50:31.068009 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.067991 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 08:50:31.068109 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.068040 2574 server.go:1295] "Started kubelet" Apr 23 08:50:31.068109 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.068085 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-146.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 08:50:31.068211 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.068147 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 08:50:31.068211 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.068145 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 08:50:31.068211 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.068189 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-146.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 08:50:31.068211 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.068206 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 08:50:31.068365 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.068218 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 08:50:31.068964 ip-10-0-136-146 systemd[1]: Started Kubernetes Kubelet. Apr 23 08:50:31.069530 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.069212 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 08:50:31.070248 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.070235 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 23 08:50:31.074754 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.074719 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 08:50:31.075122 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.075105 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 08:50:31.076081 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.075913 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 08:50:31.076081 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.075914 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 08:50:31.076261 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.076095 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 08:50:31.076322 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.076286 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 23 08:50:31.076322 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.076290 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.076322 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:50:31.076297 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 23 08:50:31.076451 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.075125 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-146.ec2.internal.18a8f044c00a5cc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-146.ec2.internal,UID:ip-10-0-136-146.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-146.ec2.internal,},FirstTimestamp:2026-04-23 08:50:31.068007625 +0000 UTC m=+0.460910667,LastTimestamp:2026-04-23 08:50:31.068007625 +0000 UTC m=+0.460910667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-146.ec2.internal,}" Apr 23 08:50:31.078420 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.078402 2574 factory.go:55] Registering systemd factory Apr 23 08:50:31.078822 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.078802 2574 factory.go:223] Registration of the systemd container factory successfully Apr 23 08:50:31.079586 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.079571 2574 factory.go:153] Registering CRI-O factory Apr 23 08:50:31.079691 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.079673 2574 factory.go:223] Registration of the crio container factory successfully Apr 23 08:50:31.079826 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.079809 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Apr 23 08:50:31.079880 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.079843 2574 factory.go:103] Registering Raw factory Apr 23 08:50:31.079880 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.079860 2574 manager.go:1196] Started watching for new ooms in manager Apr 23 08:50:31.080256 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.080227 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 08:50:31.080256 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.080251 2574 manager.go:319] Starting recovery of all containers Apr 23 08:50:31.084015 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.083991 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 08:50:31.084112 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.084090 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-146.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 08:50:31.085287 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.085261 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wcgcr" Apr 23 08:50:31.090415 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.090392 2574 manager.go:324] Recovery completed Apr 23 08:50:31.092841 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.092823 2574 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wcgcr" Apr 23 08:50:31.095076 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.095064 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:31.097291 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.097277 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:31.097349 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.097304 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:31.097349 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.097317 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:31.097867 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.097851 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 08:50:31.097867 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.097867 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 08:50:31.097970 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.097882 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:50:31.100848 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.100836 2574 policy_none.go:49] "None policy: Start" Apr 23 08:50:31.100890 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.100853 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 08:50:31.100890 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.100864 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 23 08:50:31.145298 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145272 2574 manager.go:341] "Starting Device Plugin manager" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.145328 2574 manager.go:517] "Failed 
to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145341 2574 server.go:85] "Starting device plugin registration server" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145589 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145599 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145682 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145804 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.145818 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.146270 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 08:50:31.162045 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.146303 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.178533 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.178495 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 08:50:31.179654 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.179639 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 23 08:50:31.179804 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.179667 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 08:50:31.179804 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.179688 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 08:50:31.179804 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.179697 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 08:50:31.179804 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.179747 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 08:50:31.183580 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.183558 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:31.246042 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.245958 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:31.246871 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.246850 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:31.246969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.246884 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:31.246969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.246896 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:31.246969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.246919 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.255850 ip-10-0-136-146 kubenswrapper[2574]: I0423 
08:50:31.255834 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.255897 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.255856 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-146.ec2.internal\": node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.279509 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.279486 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.280613 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.280596 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal"] Apr 23 08:50:31.280678 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.280669 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:31.281469 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.281452 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:31.281538 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.281480 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:31.281538 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.281491 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:31.283814 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.283802 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:31.283957 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.283942 2574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.284013 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.283971 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:31.284487 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.284471 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:31.284487 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.284479 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:31.284618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.284505 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:31.284618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.284519 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:31.284618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.284563 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:31.284618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.284584 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:31.286810 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.286793 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.286892 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.286821 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:31.287469 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.287454 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:31.287527 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.287483 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:31.287527 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.287497 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:31.304220 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.304200 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-146.ec2.internal\" not found" node="ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.307414 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.307399 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-146.ec2.internal\" not found" node="ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.378199 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.378173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0cee8a4266df6bc7dea9113abde0fdea-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal\" (UID: \"0cee8a4266df6bc7dea9113abde0fdea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.378321 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:50:31.378206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cee8a4266df6bc7dea9113abde0fdea-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal\" (UID: \"0cee8a4266df6bc7dea9113abde0fdea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.378321 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.378225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e2a8f2ccfc105868594a8460dd5ad37-config\") pod \"kube-apiserver-proxy-ip-10-0-136-146.ec2.internal\" (UID: \"5e2a8f2ccfc105868594a8460dd5ad37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.380266 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.380252 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.478733 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.478694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0cee8a4266df6bc7dea9113abde0fdea-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal\" (UID: \"0cee8a4266df6bc7dea9113abde0fdea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.478733 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.478738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cee8a4266df6bc7dea9113abde0fdea-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal\" (UID: \"0cee8a4266df6bc7dea9113abde0fdea\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.478892 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.478782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cee8a4266df6bc7dea9113abde0fdea-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal\" (UID: \"0cee8a4266df6bc7dea9113abde0fdea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.478892 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.478803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e2a8f2ccfc105868594a8460dd5ad37-config\") pod \"kube-apiserver-proxy-ip-10-0-136-146.ec2.internal\" (UID: \"5e2a8f2ccfc105868594a8460dd5ad37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.478892 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.478779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0cee8a4266df6bc7dea9113abde0fdea-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal\" (UID: \"0cee8a4266df6bc7dea9113abde0fdea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.478892 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.478843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e2a8f2ccfc105868594a8460dd5ad37-config\") pod \"kube-apiserver-proxy-ip-10-0-136-146.ec2.internal\" (UID: \"5e2a8f2ccfc105868594a8460dd5ad37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.480774 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.480760 2574 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.581609 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.581551 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.606783 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.606747 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.611295 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.611274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" Apr 23 08:50:31.682044 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.682002 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.782519 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.782487 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.883142 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:31.883043 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:31.980540 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.980508 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 08:50:31.981205 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:31.980651 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 08:50:31.983721 ip-10-0-136-146 
kubenswrapper[2574]: E0423 08:50:31.983684 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-146.ec2.internal\" not found" Apr 23 08:50:32.050330 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.050306 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:32.055253 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.055231 2574 apiserver.go:52] "Watching apiserver" Apr 23 08:50:32.064642 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.064624 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:50:32.064965 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.064947 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-qw8wp","openshift-cluster-node-tuning-operator/tuned-4469g","openshift-image-registry/node-ca-kk9bd","openshift-multus/multus-6sq69","openshift-multus/multus-additional-cni-plugins-7mkzw","openshift-multus/network-metrics-daemon-hfh7w","openshift-network-diagnostics/network-check-target-5f7d6","openshift-ovn-kubernetes/ovnkube-node-2qbpw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6","openshift-network-operator/iptables-alerter-dtwjh"] Apr 23 08:50:32.069767 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.069748 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.069847 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.069814 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.072335 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.072147 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5fj2c\"" Apr 23 08:50:32.072335 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.072183 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qvdn9\"" Apr 23 08:50:32.072335 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.072234 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:50:32.072554 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.072506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:50:32.073131 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.073108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 08:50:32.073221 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.073171 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:50:32.074833 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.074819 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 08:50:32.075343 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.075328 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.075410 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.075397 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.075539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.075514 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" Apr 23 08:50:32.077420 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.077400 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 08:50:32.077525 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.077475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 08:50:32.077693 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.077677 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:50:32.077693 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.077687 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 08:50:32.077927 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.077916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:50:32.078800 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.078778 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:50:32.078898 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.078811 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2snlq\"" Apr 23 08:50:32.078898 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.078779 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-ggpnv\"" Apr 23 08:50:32.078898 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.078819 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:50:32.079769 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.079750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.079848 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.079789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:32.080003 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.079977 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a" Apr 23 08:50:32.081719 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-lib-modules\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.081815 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-tuned\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.081815 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081779 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-system-cni-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.081918 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-conf-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.081918 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081843 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-system-cni-dir\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.081918 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/280929f6-fbfd-40eb-8b83-01a18c96fa3f-agent-certs\") pod \"konnectivity-agent-qw8wp\" (UID: \"280929f6-fbfd-40eb-8b83-01a18c96fa3f\") " pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.081918 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081875 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lghp9\"" Apr 23 08:50:32.081918 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtfm5\" (UniqueName: \"kubernetes.io/projected/23d44ac8-ae42-4654-8139-0c9ae73fb124-kube-api-access-wtfm5\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.081918 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d8ffaf53-084c-43ec-9bde-51d70f29f38b-serviceca\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.081957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbql\" (UniqueName: 
\"kubernetes.io/projected/d8ffaf53-084c-43ec-9bde-51d70f29f38b-kube-api-access-fsbql\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-socket-dir-parent\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-hostroot\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghnv\" (UniqueName: \"kubernetes.io/projected/21eeab6f-10d8-432b-aeab-0166ad5410c3-kube-api-access-mghnv\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-systemd\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082092 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23d44ac8-ae42-4654-8139-0c9ae73fb124-tmp\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-cni-binary-copy\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-k8s-cni-cncf-io\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082148 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:50:32.082213 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082151 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-cni-bin\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 
08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-multus-certs\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082317 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.082364 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-cni-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-cnibin\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-cni-multus\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-run\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysconfig\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-kubernetes\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysctl-d\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-host\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082631 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ffaf53-084c-43ec-9bde-51d70f29f38b-host\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.082655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-netns\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-daemon-config\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-os-release\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-kubelet\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082760 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-etc-kubernetes\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082777 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtxht\" (UniqueName: \"kubernetes.io/projected/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-kube-api-access-wtxht\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082809 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-os-release\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-sys\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/280929f6-fbfd-40eb-8b83-01a18c96fa3f-konnectivity-ca\") pod \"konnectivity-agent-qw8wp\" (UID: \"280929f6-fbfd-40eb-8b83-01a18c96fa3f\") " pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-modprobe-d\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysctl-conf\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-var-lib-kubelet\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-cnibin\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.083168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.082952 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.084438 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.084420 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:50:32.085357 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.085315 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.087535 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.087517 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:50:32.088474 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.088452 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.088881 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.088855 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:50:32.089024 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.088887 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:50:32.089846 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.089827 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:50:32.089969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.089886 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:50:32.089969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.089905 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d7sbx\"" Apr 23 08:50:32.089969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.089922 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:50:32.090365 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.090350 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:50:32.090548 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.090532 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:50:32.090941 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.090918 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal"] Apr 23 08:50:32.091014 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.090990 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:50:32.091055 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.091014 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.091088 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.091014 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j4qxb\"" Apr 23 08:50:32.091436 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.091416 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:50:32.091509 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.091485 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" Apr 23 08:50:32.093144 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.093117 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:50:32.093144 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.093137 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:50:32.093298 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.093267 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:50:32.093947 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.093932 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mphhw\""
Apr 23 08:50:32.094625 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.094572 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:45:31 +0000 UTC" deadline="2027-10-23 11:03:02.556920682 +0000 UTC"
Apr 23 08:50:32.094693 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.094625 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13154h12m30.462299328s"
Apr 23 08:50:32.094693 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.094670 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:32.099834 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.099816 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:50:32.099907 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.099845 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal"]
Apr 23 08:50:32.109478 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.109459 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-z6x8h"
Apr 23 08:50:32.120814 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.120785 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-z6x8h"
Apr 23 08:50:32.177449 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.177431 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 08:50:32.183689 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183667 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-device-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.183836 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-lib-modules\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.183836 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-etc-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.183836 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-log-socket\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.183836 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-registration-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.183836 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183787 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t574\" (UniqueName: \"kubernetes.io/projected/8134e095-58a2-4e24-a2fb-a39cfb902acf-kube-api-access-2t574\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh"
Apr 23 08:50:32.183836 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovnkube-script-lib\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrzh\" (UniqueName: \"kubernetes.io/projected/e2bc1947-82a8-463d-9645-47c17b4bb97d-kube-api-access-bsrzh\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-lib-modules\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/280929f6-fbfd-40eb-8b83-01a18c96fa3f-agent-certs\") pod \"konnectivity-agent-qw8wp\" (UID: \"280929f6-fbfd-40eb-8b83-01a18c96fa3f\") " pod="kube-system/konnectivity-agent-qw8wp"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtfm5\" (UniqueName: \"kubernetes.io/projected/23d44ac8-ae42-4654-8139-0c9ae73fb124-kube-api-access-wtfm5\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mghnv\" (UniqueName: \"kubernetes.io/projected/21eeab6f-10d8-432b-aeab-0166ad5410c3-kube-api-access-mghnv\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183943 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-var-lib-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.183981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23d44ac8-ae42-4654-8139-0c9ae73fb124-tmp\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-k8s-cni-cncf-io\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tcrk\" (UniqueName: \"kubernetes.io/projected/47f515b8-3d0e-4a31-898f-c3738e20428a-kube-api-access-7tcrk\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:32.184129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-kubelet\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-cni-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-k8s-cni-cncf-io\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysconfig\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-daemon-config\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-os-release\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysconfig\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184199 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-systemd\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184222 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-cni-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-os-release\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-kubelet\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-kubelet\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:32.184667 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184493 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-run-netns\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-socket-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-modprobe-d\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-var-lib-kubelet\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-modprobe-d\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-cnibin\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184751 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-cnibin\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-var-lib-kubelet\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-node-log\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-tuned\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184826 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-system-cni-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-daemon-config\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-conf-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8134e095-58a2-4e24-a2fb-a39cfb902acf-host-slash\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-system-cni-dir\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.185539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184908 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-system-cni-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184933 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-sys-fs\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-conf-dir\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d8ffaf53-084c-43ec-9bde-51d70f29f38b-serviceca\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184982 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-system-cni-dir\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.184986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbql\" (UniqueName: \"kubernetes.io/projected/d8ffaf53-084c-43ec-9bde-51d70f29f38b-kube-api-access-fsbql\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-socket-dir-parent\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-hostroot\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-systemd\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-multus-socket-dir-parent\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-cni-binary-copy\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-cni-bin\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-multus-certs\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-systemd\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.186504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d8ffaf53-084c-43ec-9bde-51d70f29f38b-serviceca\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185364 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-hostroot\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185381 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-cni-bin\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-multus-certs\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-cni-binary-copy\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-systemd-units\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-ovn\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-run-ovn-kubernetes\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.185998 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-cnibin\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-cni-multus\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186080 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8134e095-58a2-4e24-a2fb-a39cfb902acf-iptables-alerter-script\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-var-lib-cni-multus\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-run\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-kubernetes\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-run\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187320 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21eeab6f-10d8-432b-aeab-0166ad5410c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysctl-d\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-host\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186292 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-cnibin\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysctl-d\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ffaf53-084c-43ec-9bde-51d70f29f38b-host\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-kubernetes\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-netns\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-etc-kubernetes\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/21eeab6f-10d8-432b-aeab-0166ad5410c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtxht\" (UniqueName: \"kubernetes.io/projected/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-kube-api-access-wtxht\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-os-release\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69"
Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186541 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ffaf53-084c-43ec-9bde-51d70f29f38b-host\") pod \"node-ca-kk9bd\" (UID: 
\"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-host-run-netns\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-host\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186794 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-etc-kubernetes\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-slash\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-os-release\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 
23 08:50:32.187902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-cni-bin\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovnkube-config\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovn-node-metrics-cert\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7n62\" (UniqueName: \"kubernetes.io/projected/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-kube-api-access-j7n62\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.186974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-sys\") pod \"tuned-4469g\" (UID: 
\"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/280929f6-fbfd-40eb-8b83-01a18c96fa3f-konnectivity-ca\") pod \"konnectivity-agent-qw8wp\" (UID: \"280929f6-fbfd-40eb-8b83-01a18c96fa3f\") " pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysctl-conf\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-sys\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-cni-netd\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-env-overrides\") pod 
\"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-sysctl-conf\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23d44ac8-ae42-4654-8139-0c9ae73fb124-tmp\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/280929f6-fbfd-40eb-8b83-01a18c96fa3f-konnectivity-ca\") pod \"konnectivity-agent-qw8wp\" (UID: \"280929f6-fbfd-40eb-8b83-01a18c96fa3f\") " pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23d44ac8-ae42-4654-8139-0c9ae73fb124-etc-tuned\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.188384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.187745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/280929f6-fbfd-40eb-8b83-01a18c96fa3f-agent-certs\") pod \"konnectivity-agent-qw8wp\" (UID: 
\"280929f6-fbfd-40eb-8b83-01a18c96fa3f\") " pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.192280 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.192256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtfm5\" (UniqueName: \"kubernetes.io/projected/23d44ac8-ae42-4654-8139-0c9ae73fb124-kube-api-access-wtfm5\") pod \"tuned-4469g\" (UID: \"23d44ac8-ae42-4654-8139-0c9ae73fb124\") " pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.192556 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.192538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghnv\" (UniqueName: \"kubernetes.io/projected/21eeab6f-10d8-432b-aeab-0166ad5410c3-kube-api-access-mghnv\") pod \"multus-additional-cni-plugins-7mkzw\" (UID: \"21eeab6f-10d8-432b-aeab-0166ad5410c3\") " pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.192597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.192538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbql\" (UniqueName: \"kubernetes.io/projected/d8ffaf53-084c-43ec-9bde-51d70f29f38b-kube-api-access-fsbql\") pod \"node-ca-kk9bd\" (UID: \"d8ffaf53-084c-43ec-9bde-51d70f29f38b\") " pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.193486 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.193466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtxht\" (UniqueName: \"kubernetes.io/projected/33b4ebcd-fa1d-434d-b23c-b9216777b5a2-kube-api-access-wtxht\") pod \"multus-6sq69\" (UID: \"33b4ebcd-fa1d-434d-b23c-b9216777b5a2\") " pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.224799 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.224753 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cee8a4266df6bc7dea9113abde0fdea.slice/crio-98e14217099de306692a53cd8143a027e19548fb6cc2f08df0ec23605dc36298 WatchSource:0}: Error finding container 98e14217099de306692a53cd8143a027e19548fb6cc2f08df0ec23605dc36298: Status 404 returned error can't find the container with id 98e14217099de306692a53cd8143a027e19548fb6cc2f08df0ec23605dc36298 Apr 23 08:50:32.230927 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.230909 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:50:32.234609 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.234572 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2a8f2ccfc105868594a8460dd5ad37.slice/crio-52950547386bb7c0bde739b57e8f899dcc253cacc9b126572ecd6fca869197ab WatchSource:0}: Error finding container 52950547386bb7c0bde739b57e8f899dcc253cacc9b126572ecd6fca869197ab: Status 404 returned error can't find the container with id 52950547386bb7c0bde739b57e8f899dcc253cacc9b126572ecd6fca869197ab Apr 23 08:50:32.281688 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.281652 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:32.287376 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-etc-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287478 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-log-socket\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287478 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-registration-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.287478 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t574\" (UniqueName: \"kubernetes.io/projected/8134e095-58a2-4e24-a2fb-a39cfb902acf-kube-api-access-2t574\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.287605 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-log-socket\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287605 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-registration-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.287605 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287539 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovnkube-script-lib\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287605 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrzh\" (UniqueName: \"kubernetes.io/projected/e2bc1947-82a8-463d-9645-47c17b4bb97d-kube-api-access-bsrzh\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.287799 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287600 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-etc-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287799 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-var-lib-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287799 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287799 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tcrk\" (UniqueName: \"kubernetes.io/projected/47f515b8-3d0e-4a31-898f-c3738e20428a-kube-api-access-7tcrk\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:32.287799 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-var-lib-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.287799 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-kubelet\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-kubelet\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-systemd\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.287972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.288009 2574 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-run-netns\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288036 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-openvswitch\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288001 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-systemd\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.288071 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:32.788041131 +0000 UTC m=+2.180944162 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-run-netns\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-socket-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-node-log\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 
08:50:32.288180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8134e095-58a2-4e24-a2fb-a39cfb902acf-host-slash\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-socket-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-sys-fs\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-node-log\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-sys-fs\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:50:32.288199 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovnkube-script-lib\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8134e095-58a2-4e24-a2fb-a39cfb902acf-host-slash\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.288389 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288319 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-systemd-units\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-ovn\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-run-ovn-kubernetes\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-systemd-units\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8134e095-58a2-4e24-a2fb-a39cfb902acf-iptables-alerter-script\") pod \"iptables-alerter-dtwjh\" 
(UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-run-ovn\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-slash\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-cni-bin\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovnkube-config\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-slash\") pod \"ovnkube-node-2qbpw\" (UID: 
\"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-cni-bin\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-run-ovn-kubernetes\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovn-node-metrics-cert\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7n62\" (UniqueName: \"kubernetes.io/projected/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-kube-api-access-j7n62\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-cni-netd\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-env-overrides\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289006 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-device-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.289552 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288666 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-host-cni-netd\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289552 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288673 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2bc1947-82a8-463d-9645-47c17b4bb97d-device-dir\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.289552 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.288956 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovnkube-config\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.289552 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.289075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8134e095-58a2-4e24-a2fb-a39cfb902acf-iptables-alerter-script\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.289552 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.289098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-env-overrides\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.290733 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.290698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-ovn-node-metrics-cert\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.294166 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.294150 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:32.294235 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.294171 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:32.294235 
ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.294183 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:32.294335 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.294259 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:32.794242901 +0000 UTC m=+2.187145954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:32.296223 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.296204 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrzh\" (UniqueName: \"kubernetes.io/projected/e2bc1947-82a8-463d-9645-47c17b4bb97d-kube-api-access-bsrzh\") pod \"aws-ebs-csi-driver-node-5l8z6\" (UID: \"e2bc1947-82a8-463d-9645-47c17b4bb97d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.296330 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.296258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t574\" (UniqueName: \"kubernetes.io/projected/8134e095-58a2-4e24-a2fb-a39cfb902acf-kube-api-access-2t574\") pod \"iptables-alerter-dtwjh\" (UID: \"8134e095-58a2-4e24-a2fb-a39cfb902acf\") " 
pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.296947 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.296931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7n62\" (UniqueName: \"kubernetes.io/projected/6dbd9100-2dd7-4450-a0e7-2f86e96b3487-kube-api-access-j7n62\") pod \"ovnkube-node-2qbpw\" (UID: \"6dbd9100-2dd7-4450-a0e7-2f86e96b3487\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.297114 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.297098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tcrk\" (UniqueName: \"kubernetes.io/projected/47f515b8-3d0e-4a31-898f-c3738e20428a-kube-api-access-7tcrk\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:32.390616 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.390593 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:32.397242 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.397181 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280929f6_fbfd_40eb_8b83_01a18c96fa3f.slice/crio-47343d4eb3df91c3f7f42701befb89c287c2b466f1b6afe4ed11c4a87f206d6c WatchSource:0}: Error finding container 47343d4eb3df91c3f7f42701befb89c287c2b466f1b6afe4ed11c4a87f206d6c: Status 404 returned error can't find the container with id 47343d4eb3df91c3f7f42701befb89c287c2b466f1b6afe4ed11c4a87f206d6c Apr 23 08:50:32.411549 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.411519 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4469g" Apr 23 08:50:32.415066 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.415041 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kk9bd" Apr 23 08:50:32.417421 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.417398 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23d44ac8_ae42_4654_8139_0c9ae73fb124.slice/crio-00aa0d0f580dad7cfbbd1e94e60cec16731e3ee7bab1a361f27cfb3cf5083e8a WatchSource:0}: Error finding container 00aa0d0f580dad7cfbbd1e94e60cec16731e3ee7bab1a361f27cfb3cf5083e8a: Status 404 returned error can't find the container with id 00aa0d0f580dad7cfbbd1e94e60cec16731e3ee7bab1a361f27cfb3cf5083e8a Apr 23 08:50:32.422536 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.422515 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ffaf53_084c_43ec_9bde_51d70f29f38b.slice/crio-e4cd932acf94b7663b10bda15a311c0d20b8e0602ab1b9b760696f85a6b9186e WatchSource:0}: Error finding container e4cd932acf94b7663b10bda15a311c0d20b8e0602ab1b9b760696f85a6b9186e: Status 404 returned error can't find the container with id e4cd932acf94b7663b10bda15a311c0d20b8e0602ab1b9b760696f85a6b9186e Apr 23 08:50:32.428745 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.428726 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6sq69" Apr 23 08:50:32.435588 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.435570 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b4ebcd_fa1d_434d_b23c_b9216777b5a2.slice/crio-8b0847645b606f0a08b1210e796becf3ed46e5894a8bd9488d2dfd9fde629f39 WatchSource:0}: Error finding container 8b0847645b606f0a08b1210e796becf3ed46e5894a8bd9488d2dfd9fde629f39: Status 404 returned error can't find the container with id 8b0847645b606f0a08b1210e796becf3ed46e5894a8bd9488d2dfd9fde629f39 Apr 23 08:50:32.443859 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.443842 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" Apr 23 08:50:32.450887 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.450866 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21eeab6f_10d8_432b_aeab_0166ad5410c3.slice/crio-80914e77a2dda03ad81426577afd928168110da86de85275a3cd495eff0c362a WatchSource:0}: Error finding container 80914e77a2dda03ad81426577afd928168110da86de85275a3cd495eff0c362a: Status 404 returned error can't find the container with id 80914e77a2dda03ad81426577afd928168110da86de85275a3cd495eff0c362a Apr 23 08:50:32.461404 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.461387 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" Apr 23 08:50:32.465940 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.465925 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" Apr 23 08:50:32.466667 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.466644 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dbd9100_2dd7_4450_a0e7_2f86e96b3487.slice/crio-58daf76a4d6e29a05e849bc1eb9c5b5beb50e8b6b2b768cb17afb0b1d2549a4a WatchSource:0}: Error finding container 58daf76a4d6e29a05e849bc1eb9c5b5beb50e8b6b2b768cb17afb0b1d2549a4a: Status 404 returned error can't find the container with id 58daf76a4d6e29a05e849bc1eb9c5b5beb50e8b6b2b768cb17afb0b1d2549a4a Apr 23 08:50:32.471063 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.471043 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dtwjh" Apr 23 08:50:32.471476 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.471438 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bc1947_82a8_463d_9645_47c17b4bb97d.slice/crio-d409efce64a477b23ce0cbf525be7bda0d933a93ff08a6a4e06ef3fd26dd4721 WatchSource:0}: Error finding container d409efce64a477b23ce0cbf525be7bda0d933a93ff08a6a4e06ef3fd26dd4721: Status 404 returned error can't find the container with id d409efce64a477b23ce0cbf525be7bda0d933a93ff08a6a4e06ef3fd26dd4721 Apr 23 08:50:32.477273 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:50:32.477255 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8134e095_58a2_4e24_a2fb_a39cfb902acf.slice/crio-9cabaacde654a3d853ed02834aaeec90a427ac02d22b78b32828eae3d34f89f5 WatchSource:0}: Error finding container 9cabaacde654a3d853ed02834aaeec90a427ac02d22b78b32828eae3d34f89f5: Status 404 returned error can't find the container with id 9cabaacde654a3d853ed02834aaeec90a427ac02d22b78b32828eae3d34f89f5 Apr 23 
08:50:32.794806 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.793726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:32.794806 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.794234 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:32.794806 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.794302 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:33.794282095 +0000 UTC m=+3.187185126 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:32.894813 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.894778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:32.895035 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.894927 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:32.895035 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.894948 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:32.895035 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.894960 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:32.895035 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:32.895031 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:50:33.89500045 +0000 UTC m=+3.287903494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:32.988143 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:32.988117 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:33.123445 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.121850 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:45:32 +0000 UTC" deadline="2028-01-25 05:09:47.457046904 +0000 UTC" Apr 23 08:50:33.123445 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.121890 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15404h19m14.335161011s" Apr 23 08:50:33.196231 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.196172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dtwjh" event={"ID":"8134e095-58a2-4e24-a2fb-a39cfb902acf","Type":"ContainerStarted","Data":"9cabaacde654a3d853ed02834aaeec90a427ac02d22b78b32828eae3d34f89f5"} Apr 23 08:50:33.216838 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.216801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" event={"ID":"e2bc1947-82a8-463d-9645-47c17b4bb97d","Type":"ContainerStarted","Data":"d409efce64a477b23ce0cbf525be7bda0d933a93ff08a6a4e06ef3fd26dd4721"} Apr 23 08:50:33.226958 ip-10-0-136-146 kubenswrapper[2574]: I0423 
08:50:33.226925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6sq69" event={"ID":"33b4ebcd-fa1d-434d-b23c-b9216777b5a2","Type":"ContainerStarted","Data":"8b0847645b606f0a08b1210e796becf3ed46e5894a8bd9488d2dfd9fde629f39"} Apr 23 08:50:33.235665 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.235624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kk9bd" event={"ID":"d8ffaf53-084c-43ec-9bde-51d70f29f38b","Type":"ContainerStarted","Data":"e4cd932acf94b7663b10bda15a311c0d20b8e0602ab1b9b760696f85a6b9186e"} Apr 23 08:50:33.239866 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.239812 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4469g" event={"ID":"23d44ac8-ae42-4654-8139-0c9ae73fb124","Type":"ContainerStarted","Data":"00aa0d0f580dad7cfbbd1e94e60cec16731e3ee7bab1a361f27cfb3cf5083e8a"} Apr 23 08:50:33.259369 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.259339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qw8wp" event={"ID":"280929f6-fbfd-40eb-8b83-01a18c96fa3f","Type":"ContainerStarted","Data":"47343d4eb3df91c3f7f42701befb89c287c2b466f1b6afe4ed11c4a87f206d6c"} Apr 23 08:50:33.260938 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.260911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" event={"ID":"5e2a8f2ccfc105868594a8460dd5ad37","Type":"ContainerStarted","Data":"52950547386bb7c0bde739b57e8f899dcc253cacc9b126572ecd6fca869197ab"} Apr 23 08:50:33.271164 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.271129 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" event={"ID":"0cee8a4266df6bc7dea9113abde0fdea","Type":"ContainerStarted","Data":"98e14217099de306692a53cd8143a027e19548fb6cc2f08df0ec23605dc36298"} 
Apr 23 08:50:33.280157 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.278950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"58daf76a4d6e29a05e849bc1eb9c5b5beb50e8b6b2b768cb17afb0b1d2549a4a"} Apr 23 08:50:33.280157 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.280149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerStarted","Data":"80914e77a2dda03ad81426577afd928168110da86de85275a3cd495eff0c362a"} Apr 23 08:50:33.801187 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.801137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:33.801352 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:33.801275 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:33.801352 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:33.801339 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:35.801320712 +0000 UTC m=+5.194223745 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:33.901963 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:33.901922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:33.902140 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:33.902103 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:33.902140 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:33.902121 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:33.902140 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:33.902133 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:33.902301 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:33.902190 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:50:35.902172169 +0000 UTC m=+5.295075216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:34.122978 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:34.122889 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:45:32 +0000 UTC" deadline="2027-09-29 03:29:51.064401266 +0000 UTC" Apr 23 08:50:34.122978 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:34.122932 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12570h39m16.941473824s" Apr 23 08:50:34.182090 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:34.182056 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:34.182273 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:34.182185 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5" Apr 23 08:50:34.182701 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:34.182678 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:34.182845 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:34.182807 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:34.974186 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:34.973398 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rm7wx"]
Apr 23 08:50:34.976870 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:34.976387 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:34.976870 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:34.976472 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:35.011000 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.010954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.011199 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.011090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/51448c01-b78f-45a0-89ef-99a6d2c0613c-dbus\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.011199 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.011143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/51448c01-b78f-45a0-89ef-99a6d2c0613c-kubelet-config\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.112125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/51448c01-b78f-45a0-89ef-99a6d2c0613c-dbus\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.112181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/51448c01-b78f-45a0-89ef-99a6d2c0613c-kubelet-config\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.112223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.112383 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.112446 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret podName:51448c01-b78f-45a0-89ef-99a6d2c0613c nodeName:}" failed. No retries permitted until 2026-04-23 08:50:35.612426264 +0000 UTC m=+5.005329309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret") pod "global-pull-secret-syncer-rm7wx" (UID: "51448c01-b78f-45a0-89ef-99a6d2c0613c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.112809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/51448c01-b78f-45a0-89ef-99a6d2c0613c-dbus\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.112917 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.112872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/51448c01-b78f-45a0-89ef-99a6d2c0613c-kubelet-config\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.617117 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.617073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:35.617584 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.617233 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:35.617584 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.617296 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret podName:51448c01-b78f-45a0-89ef-99a6d2c0613c nodeName:}" failed. No retries permitted until 2026-04-23 08:50:36.617277393 +0000 UTC m=+6.010180426 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret") pod "global-pull-secret-syncer-rm7wx" (UID: "51448c01-b78f-45a0-89ef-99a6d2c0613c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:35.819756 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.819720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:35.819940 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.819879 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:35.819940 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.819939 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:39.819921754 +0000 UTC m=+9.212824799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:35.921522 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:35.921069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:35.921522 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.921258 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:35.921522 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.921276 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:35.921522 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.921290 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:35.921522 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:35.921362 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:39.921342189 +0000 UTC m=+9.314245223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:36.180734 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:36.180639 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:36.180904 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:36.180794 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:36.181496 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:36.181191 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:36.181496 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:36.181299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:36.181496 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:36.181349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:36.181496 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:36.181412 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:36.626842 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:36.626760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:36.627270 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:36.626947 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:36.627270 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:36.627019 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret podName:51448c01-b78f-45a0-89ef-99a6d2c0613c nodeName:}" failed. No retries permitted until 2026-04-23 08:50:38.627000501 +0000 UTC m=+8.019903532 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret") pod "global-pull-secret-syncer-rm7wx" (UID: "51448c01-b78f-45a0-89ef-99a6d2c0613c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:38.180376 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:38.180340 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:38.180872 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:38.180475 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:38.180872 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:38.180574 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:38.180872 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:38.180589 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:38.180872 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:38.180653 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:38.180872 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:38.180750 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:38.641680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:38.641555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:38.641880 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:38.641699 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:38.641880 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:38.641830 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret podName:51448c01-b78f-45a0-89ef-99a6d2c0613c nodeName:}" failed. No retries permitted until 2026-04-23 08:50:42.641809091 +0000 UTC m=+12.034712137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret") pod "global-pull-secret-syncer-rm7wx" (UID: "51448c01-b78f-45a0-89ef-99a6d2c0613c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:39.852437 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:39.852398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:39.852944 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:39.852530 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:39.852944 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:39.852608 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:47.852584908 +0000 UTC m=+17.245487958 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:39.953824 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:39.953730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:39.954015 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:39.953924 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:39.954015 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:39.953948 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:39.954015 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:39.953961 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:39.954177 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:39.954030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:47.954009261 +0000 UTC m=+17.346912311 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:40.180596 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:40.180496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:40.180982 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:40.180633 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:40.180982 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:40.180496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:40.180982 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:40.180496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:40.180982 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:40.180765 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:40.180982 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:40.180867 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:42.180105 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:42.180069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:42.180533 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:42.180071 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:42.180533 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:42.180203 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:42.180533 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:42.180311 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:42.180533 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:42.180070 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:42.180533 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:42.180409 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:42.672005 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:42.671960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:42.672203 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:42.672126 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:42.672268 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:42.672210 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret podName:51448c01-b78f-45a0-89ef-99a6d2c0613c nodeName:}" failed. No retries permitted until 2026-04-23 08:50:50.672192442 +0000 UTC m=+20.065095474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret") pod "global-pull-secret-syncer-rm7wx" (UID: "51448c01-b78f-45a0-89ef-99a6d2c0613c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:50:44.180725 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:44.180682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:44.181172 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:44.180686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:44.181172 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:44.180814 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:44.181172 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:44.180926 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:44.181172 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:44.180686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:44.181172 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:44.181017 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:46.180110 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.180073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:46.180558 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.180073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:46.180558 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:46.180197 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:46.180558 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.180073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:46.180558 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:46.180278 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:46.180558 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:46.180351 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:46.554984 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.554949 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-th9nj"]
Apr 23 08:50:46.561758 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.561703 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.564364 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.564342 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fjfk4\""
Apr 23 08:50:46.564530 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.564342 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 08:50:46.564530 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.564502 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 08:50:46.597665 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.597632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b62d5\" (UniqueName: \"kubernetes.io/projected/c65f288c-6f59-486d-a1fb-454d54bcb237-kube-api-access-b62d5\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.597818 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.597671 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c65f288c-6f59-486d-a1fb-454d54bcb237-tmp-dir\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.597818 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.597692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c65f288c-6f59-486d-a1fb-454d54bcb237-hosts-file\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.698529 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.698482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b62d5\" (UniqueName: \"kubernetes.io/projected/c65f288c-6f59-486d-a1fb-454d54bcb237-kube-api-access-b62d5\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.698731 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.698544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c65f288c-6f59-486d-a1fb-454d54bcb237-tmp-dir\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.698731 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.698568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c65f288c-6f59-486d-a1fb-454d54bcb237-hosts-file\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.698731 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.698680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c65f288c-6f59-486d-a1fb-454d54bcb237-hosts-file\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.698974 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.698950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c65f288c-6f59-486d-a1fb-454d54bcb237-tmp-dir\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.708342 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.708311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b62d5\" (UniqueName: \"kubernetes.io/projected/c65f288c-6f59-486d-a1fb-454d54bcb237-kube-api-access-b62d5\") pod \"node-resolver-th9nj\" (UID: \"c65f288c-6f59-486d-a1fb-454d54bcb237\") " pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:46.872291 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:46.872207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-th9nj"
Apr 23 08:50:47.906406 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:47.906371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:47.906851 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:47.906525 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:47.906851 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:47.906587 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:51:03.906572248 +0000 UTC m=+33.299475283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:48.007442 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:48.007405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:48.007627 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.007585 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:48.007627 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.007608 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:48.007627 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.007618 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:48.007802 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.007678 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:04.007660234 +0000 UTC m=+33.400563283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:48.180735 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:48.180642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:48.180735 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:48.180679 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:48.180935 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:48.180642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:48.180935 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.180792 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:48.180935 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.180866 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c" Apr 23 08:50:48.180935 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:48.180927 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5" Apr 23 08:50:50.180780 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.180736 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx" Apr 23 08:50:50.181221 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.180786 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:50.181221 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.180872 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:50.181221 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:50.180885 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c" Apr 23 08:50:50.181221 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:50.180962 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5" Apr 23 08:50:50.181221 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:50.181019 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a" Apr 23 08:50:50.310114 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.310078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4469g" event={"ID":"23d44ac8-ae42-4654-8139-0c9ae73fb124","Type":"ContainerStarted","Data":"ce917c1229f0e39535654a29393418669ea176755ff517cfbf6259b560705ed9"} Apr 23 08:50:50.313061 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.313030 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"fd0c3cbded621c4f33cee069a6684702811955d5e3959c8cef9431dce9adbd9e"} Apr 23 08:50:50.314260 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.314234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-th9nj" 
event={"ID":"c65f288c-6f59-486d-a1fb-454d54bcb237","Type":"ContainerStarted","Data":"aac18ae39237430f3db776a125fe38835fe9d48d39c6ff852c74783bf821ad46"} Apr 23 08:50:50.325444 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.325386 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4469g" podStartSLOduration=1.7533794029999998 podStartE2EDuration="19.325371245s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.419343461 +0000 UTC m=+1.812246492" lastFinishedPulling="2026-04-23 08:50:49.991335287 +0000 UTC m=+19.384238334" observedRunningTime="2026-04-23 08:50:50.325126334 +0000 UTC m=+19.718029390" watchObservedRunningTime="2026-04-23 08:50:50.325371245 +0000 UTC m=+19.718274297" Apr 23 08:50:50.728153 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:50.727888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx" Apr 23 08:50:50.728290 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:50.728029 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:50:50.728290 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:50.728255 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret podName:51448c01-b78f-45a0-89ef-99a6d2c0613c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:06.728235855 +0000 UTC m=+36.121138905 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret") pod "global-pull-secret-syncer-rm7wx" (UID: "51448c01-b78f-45a0-89ef-99a6d2c0613c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:50:51.319426 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.319389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" event={"ID":"e2bc1947-82a8-463d-9645-47c17b4bb97d","Type":"ContainerStarted","Data":"e94df15a5542d298708afeebbb89f291ffc55cf72095180934ee30e109f9bcb6"} Apr 23 08:50:51.320682 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.320653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6sq69" event={"ID":"33b4ebcd-fa1d-434d-b23c-b9216777b5a2","Type":"ContainerStarted","Data":"4979679e1399db5414e8caa609546ec2379af5053c5fc70bdd68b4c38c1c016d"} Apr 23 08:50:51.322063 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.322038 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kk9bd" event={"ID":"d8ffaf53-084c-43ec-9bde-51d70f29f38b","Type":"ContainerStarted","Data":"e97cf3740ec65de9450849df887fb51cb8f2aa5701788725178a05661db52c4e"} Apr 23 08:50:51.323423 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.323396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qw8wp" event={"ID":"280929f6-fbfd-40eb-8b83-01a18c96fa3f","Type":"ContainerStarted","Data":"e20d9d6ef2f672f315c1ce172c0a540a71689d3d5240837fc3803bdf0bdeb68c"} Apr 23 08:50:51.324621 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.324594 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" 
event={"ID":"5e2a8f2ccfc105868594a8460dd5ad37","Type":"ContainerStarted","Data":"accfd0a5445795b317f0a149a9c5cade3aafccee7c2bb9f5ff89c8e4bf0e9557"} Apr 23 08:50:51.326002 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.325980 2574 generic.go:358] "Generic (PLEG): container finished" podID="0cee8a4266df6bc7dea9113abde0fdea" containerID="0985c7162303a9b8614d9483f1fb7ebaaafa0a471461dff64b7df1578d158996" exitCode=0 Apr 23 08:50:51.326117 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.326062 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" event={"ID":"0cee8a4266df6bc7dea9113abde0fdea","Type":"ContainerDied","Data":"0985c7162303a9b8614d9483f1fb7ebaaafa0a471461dff64b7df1578d158996"} Apr 23 08:50:51.328606 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328587 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log" Apr 23 08:50:51.328901 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328884 2574 generic.go:358] "Generic (PLEG): container finished" podID="6dbd9100-2dd7-4450-a0e7-2f86e96b3487" containerID="5a0208c3a40815e654a5d402b607361def2b2879eb18196bcea74ecee5f0afa7" exitCode=1 Apr 23 08:50:51.328969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"51f061a8061fb93ccf4d3721f8b2c78e7298aa119b32cab8b4924cc4b0f0d474"} Apr 23 08:50:51.328969 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"a47c989726ed72e466f03c90255c9d439da921d957c0cf4063df9b755a43e2b5"} Apr 
23 08:50:51.329055 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"e16a4476b1aca74890266a11ffe52ef3c5175ad649cf14b252dfa2bef85a33fc"} Apr 23 08:50:51.329055 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"cdc9ee49e932d2c8bc72c4bb629326820d6abc0ab88443735cbad0443367fff6"} Apr 23 08:50:51.329055 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.328988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerDied","Data":"5a0208c3a40815e654a5d402b607361def2b2879eb18196bcea74ecee5f0afa7"} Apr 23 08:50:51.330207 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.330188 2574 generic.go:358] "Generic (PLEG): container finished" podID="21eeab6f-10d8-432b-aeab-0166ad5410c3" containerID="16bcc8eb61ef1c0b01b3f27c7d6e2b140f90eeb5d3cc903b9219b8449199c9d0" exitCode=0 Apr 23 08:50:51.330294 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.330258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerDied","Data":"16bcc8eb61ef1c0b01b3f27c7d6e2b140f90eeb5d3cc903b9219b8449199c9d0"} Apr 23 08:50:51.331380 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.331356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-th9nj" event={"ID":"c65f288c-6f59-486d-a1fb-454d54bcb237","Type":"ContainerStarted","Data":"80041019d41d97edbfdf3cec13057be9b9aa22212b9ebf73c0151e54ee6eb7cf"} Apr 23 08:50:51.350337 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:50:51.350291 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-146.ec2.internal" podStartSLOduration=19.350275171 podStartE2EDuration="19.350275171s" podCreationTimestamp="2026-04-23 08:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:51.349848358 +0000 UTC m=+20.742751410" watchObservedRunningTime="2026-04-23 08:50:51.350275171 +0000 UTC m=+20.743178224" Apr 23 08:50:51.350592 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.350570 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6sq69" podStartSLOduration=2.27421173 podStartE2EDuration="20.35056411s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.436995884 +0000 UTC m=+1.829898914" lastFinishedPulling="2026-04-23 08:50:50.513348251 +0000 UTC m=+19.906251294" observedRunningTime="2026-04-23 08:50:51.339164373 +0000 UTC m=+20.732067425" watchObservedRunningTime="2026-04-23 08:50:51.35056411 +0000 UTC m=+20.743467161" Apr 23 08:50:51.362228 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.362146 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qw8wp" podStartSLOduration=6.441994317 podStartE2EDuration="20.362132626s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.39875657 +0000 UTC m=+1.791659601" lastFinishedPulling="2026-04-23 08:50:46.318894868 +0000 UTC m=+15.711797910" observedRunningTime="2026-04-23 08:50:51.361426114 +0000 UTC m=+20.754329169" watchObservedRunningTime="2026-04-23 08:50:51.362132626 +0000 UTC m=+20.755035678" Apr 23 08:50:51.372841 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.372793 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-kk9bd" podStartSLOduration=2.806606709 podStartE2EDuration="20.37277781s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.424807565 +0000 UTC m=+1.817710595" lastFinishedPulling="2026-04-23 08:50:49.990978652 +0000 UTC m=+19.383881696" observedRunningTime="2026-04-23 08:50:51.37256599 +0000 UTC m=+20.765469044" watchObservedRunningTime="2026-04-23 08:50:51.37277781 +0000 UTC m=+20.765680863" Apr 23 08:50:51.404380 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:51.404338 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-th9nj" podStartSLOduration=5.404323317 podStartE2EDuration="5.404323317s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:51.403983803 +0000 UTC m=+20.796886856" watchObservedRunningTime="2026-04-23 08:50:51.404323317 +0000 UTC m=+20.797226367" Apr 23 08:50:52.024992 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.024965 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:50:52.157316 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.157198 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:50:52.02498565Z","UUID":"2fdc48ae-9518-4249-bfe8-16940738a950","Handler":null,"Name":"","Endpoint":""} Apr 23 08:50:52.160329 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.160265 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:50:52.160329 ip-10-0-136-146 kubenswrapper[2574]: I0423 
08:50:52.160296 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:50:52.180391 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.180356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx" Apr 23 08:50:52.180505 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.180495 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:52.180550 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:52.180507 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c" Apr 23 08:50:52.180605 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:52.180585 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a" Apr 23 08:50:52.180660 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.180592 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:52.180731 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:52.180659 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5" Apr 23 08:50:52.335897 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.335856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" event={"ID":"0cee8a4266df6bc7dea9113abde0fdea","Type":"ContainerStarted","Data":"53b4ebd04e2fcb995270c4d60067068721d07d1fdda86a79ec7170738e2b0f0c"} Apr 23 08:50:52.337324 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.337293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dtwjh" event={"ID":"8134e095-58a2-4e24-a2fb-a39cfb902acf","Type":"ContainerStarted","Data":"1f602f3694141daa7c296a858f925e9ea93bd71cd7f2d26d69d5da87f1cc2d99"} Apr 23 08:50:52.339136 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.339111 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" event={"ID":"e2bc1947-82a8-463d-9645-47c17b4bb97d","Type":"ContainerStarted","Data":"080bad7a948094b755723a0f2ac6abc701081726279f7e445018e4dbc078ba39"} Apr 23 08:50:52.348324 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.348286 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-146.ec2.internal" podStartSLOduration=20.348273917 podStartE2EDuration="20.348273917s" podCreationTimestamp="2026-04-23 
08:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:52.348078791 +0000 UTC m=+21.740981843" watchObservedRunningTime="2026-04-23 08:50:52.348273917 +0000 UTC m=+21.741176968" Apr 23 08:50:52.359525 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:52.359482 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dtwjh" podStartSLOduration=3.8476056400000003 podStartE2EDuration="21.359469978s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.478575347 +0000 UTC m=+1.871478377" lastFinishedPulling="2026-04-23 08:50:49.990439678 +0000 UTC m=+19.383342715" observedRunningTime="2026-04-23 08:50:52.358933177 +0000 UTC m=+21.751836229" watchObservedRunningTime="2026-04-23 08:50:52.359469978 +0000 UTC m=+21.752373029" Apr 23 08:50:53.343797 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:53.343767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log" Apr 23 08:50:53.344496 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:53.344101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"114ef1318c7bc0c177221d62b9e1f2c954108baa83f1e746711a129141fd7c29"} Apr 23 08:50:53.348348 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:53.348300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" event={"ID":"e2bc1947-82a8-463d-9645-47c17b4bb97d","Type":"ContainerStarted","Data":"42d1f3ab8b246a5c50161e4b496ec973aba942dcf027144eff6c42f2120f81ca"} Apr 23 08:50:53.384095 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:53.384033 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5l8z6" podStartSLOduration=2.035308274 podStartE2EDuration="22.384019179s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.47390911 +0000 UTC m=+1.866812147" lastFinishedPulling="2026-04-23 08:50:52.822620016 +0000 UTC m=+22.215523052" observedRunningTime="2026-04-23 08:50:53.38351804 +0000 UTC m=+22.776421094" watchObservedRunningTime="2026-04-23 08:50:53.384019179 +0000 UTC m=+22.776922231" Apr 23 08:50:54.180250 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:54.180215 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:54.180431 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:54.180215 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx" Apr 23 08:50:54.180431 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:54.180330 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5" Apr 23 08:50:54.180431 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:54.180227 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:54.180586 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:54.180412 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c" Apr 23 08:50:54.180586 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:54.180527 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a" Apr 23 08:50:56.006971 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.006939 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:56.007745 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.007720 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qw8wp" Apr 23 08:50:56.179859 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.179831 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:50:56.180030 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.179836 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx" Apr 23 08:50:56.180030 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:56.179959 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a" Apr 23 08:50:56.180030 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.179837 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:50:56.180143 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:56.180025 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c" Apr 23 08:50:56.180143 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:56.180114 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:56.357042 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.356878 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log"
Apr 23 08:50:56.357381 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.357354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"d471642dd39f8f80c35b800c66c58de528ebe447bc6fa6129ee6345a80e995b5"}
Apr 23 08:50:56.357548 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.357533 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qw8wp"
Apr 23 08:50:56.357760 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.357728 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:56.357760 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.357752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:56.357872 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.357765 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:56.357934 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.357879 2574 scope.go:117] "RemoveContainer" containerID="5a0208c3a40815e654a5d402b607361def2b2879eb18196bcea74ecee5f0afa7"
Apr 23 08:50:56.361242 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.358438 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qw8wp"
Apr 23 08:50:56.373440 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.373409 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:56.373579 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:56.373531 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:50:57.361795 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:57.361764 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log"
Apr 23 08:50:57.362269 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:57.362086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" event={"ID":"6dbd9100-2dd7-4450-a0e7-2f86e96b3487","Type":"ContainerStarted","Data":"dcf3c950465e4addbb24dc1f87d5472c619a5abf11a89a42e579dd9652d63ee9"}
Apr 23 08:50:57.363443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:57.363417 2574 generic.go:358] "Generic (PLEG): container finished" podID="21eeab6f-10d8-432b-aeab-0166ad5410c3" containerID="373d6453c2bd12c0ca1002563660c818f08d98120e1c576256d2e4296f267cb8" exitCode=0
Apr 23 08:50:57.363561 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:57.363464 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerDied","Data":"373d6453c2bd12c0ca1002563660c818f08d98120e1c576256d2e4296f267cb8"}
Apr 23 08:50:57.389012 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:57.388963 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw" podStartSLOduration=8.825218712 podStartE2EDuration="26.388948953s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.468612453 +0000 UTC m=+1.861515483" lastFinishedPulling="2026-04-23 08:50:50.032342691 +0000 UTC m=+19.425245724" observedRunningTime="2026-04-23 08:50:57.387916933 +0000 UTC m=+26.780819985" watchObservedRunningTime="2026-04-23 08:50:57.388948953 +0000 UTC m=+26.781851983"
Apr 23 08:50:58.180581 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.180325 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:58.180581 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.180569 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:58.180888 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.180686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:58.180888 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:58.180702 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:58.180888 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:58.180809 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:58.181027 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:58.180886 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:58.216424 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.216387 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rm7wx"]
Apr 23 08:50:58.218383 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.218360 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5f7d6"]
Apr 23 08:50:58.218992 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.218969 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hfh7w"]
Apr 23 08:50:58.367499 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.367463 2574 generic.go:358] "Generic (PLEG): container finished" podID="21eeab6f-10d8-432b-aeab-0166ad5410c3" containerID="7574839a9de07e5432f55e8a9181d6e5bf236bb90830a69d4c89e0a11f3dfc3f" exitCode=0
Apr 23 08:50:58.368093 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.367592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerDied","Data":"7574839a9de07e5432f55e8a9181d6e5bf236bb90830a69d4c89e0a11f3dfc3f"}
Apr 23 08:50:58.368093 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.367633 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:50:58.368093 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:58.367726 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:50:58.368093 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.367805 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:50:58.368093 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:58.367907 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:50:58.368625 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:58.368338 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:50:58.368625 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:50:58.368434 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:50:59.371137 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:59.371106 2574 generic.go:358] "Generic (PLEG): container finished" podID="21eeab6f-10d8-432b-aeab-0166ad5410c3" containerID="b189bc854f30629c5fee554509b80b65e3d2a4d0da79fda282ea73f1026f8da0" exitCode=0
Apr 23 08:50:59.371771 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:50:59.371152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerDied","Data":"b189bc854f30629c5fee554509b80b65e3d2a4d0da79fda282ea73f1026f8da0"}
Apr 23 08:51:00.180464 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:00.180425 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:00.180627 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:00.180535 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:51:00.180736 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:00.180620 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:51:00.180736 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:00.180686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:51:00.180852 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:00.180788 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:51:00.180936 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:00.180915 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:51:02.180538 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.180260 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:51:02.180538 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.180279 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:02.181056 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.180558 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rm7wx" podUID="51448c01-b78f-45a0-89ef-99a6d2c0613c"
Apr 23 08:51:02.181056 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.180312 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:51:02.181056 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.180607 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f7d6" podUID="44958f50-5d35-4dcd-831d-1140d11671e5"
Apr 23 08:51:02.181056 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.180674 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a"
Apr 23 08:51:02.402630 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.402604 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-146.ec2.internal" event="NodeReady"
Apr 23 08:51:02.402808 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.402744 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:51:02.437975 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.437880 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"]
Apr 23 08:51:02.475865 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.475825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6fd46678c7-bjsjt"]
Apr 23 08:51:02.476034 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.475949 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"
Apr 23 08:51:02.478626 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.478566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-b2hb9\""
Apr 23 08:51:02.478626 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.478587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 08:51:02.478868 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.478854 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 08:51:02.479648 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.479021 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 08:51:02.479648 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.479136 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 08:51:02.493853 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.493825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"]
Apr 23 08:51:02.494374 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.494335 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.497057 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.497037 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48746\""
Apr 23 08:51:02.497158 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.497093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 08:51:02.497158 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.497099 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 08:51:02.497158 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.497141 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 08:51:02.502264 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.502241 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 08:51:02.512065 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.512042 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"]
Apr 23 08:51:02.512224 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.512196 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.514682 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.514663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 08:51:02.514943 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.514924 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 08:51:02.515025 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.514943 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 08:51:02.515025 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.514991 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 08:51:02.527746 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.527723 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cb4jd"]
Apr 23 08:51:02.527861 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.527822 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"
Apr 23 08:51:02.530321 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.530302 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 08:51:02.539803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.539771 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"]
Apr 23 08:51:02.539803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.539799 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"]
Apr 23 08:51:02.539803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.539809 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"]
Apr 23 08:51:02.539977 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.539818 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fd46678c7-bjsjt"]
Apr 23 08:51:02.539977 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.539829 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t97wq"]
Apr 23 08:51:02.539977 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.539870 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.541987 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.541969 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 08:51:02.542060 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.542003 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 08:51:02.542133 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.542062 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cjh2d\""
Apr 23 08:51:02.551894 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.551873 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t97wq"]
Apr 23 08:51:02.551894 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.551896 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cb4jd"]
Apr 23 08:51:02.552056 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.552025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:02.554350 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.554331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nrkt9\""
Apr 23 08:51:02.554461 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.554374 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 08:51:02.554461 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.554402 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 08:51:02.554569 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.554406 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 08:51:02.621957 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.621911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-hub\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.622142 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.621959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.622142 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxm5j\" (UniqueName: \"kubernetes.io/projected/d58a34cf-9452-41fc-8307-814f7d1cfba4-kube-api-access-hxm5j\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.622142 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-ca\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.622142 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.622307 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nsgr\" (UniqueName: \"kubernetes.io/projected/f2a6d394-9707-4877-a957-f0de6207a808-kube-api-access-5nsgr\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.622307 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b74c9b497-qm9cb\" (UID: \"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"
Apr 23 08:51:02.622307 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622307 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2a6d394-9707-4877-a957-f0de6207a808-config-volume\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.622307 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622266 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-bound-sa-token\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622307 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-certificates\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-image-registry-private-configuration\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-trusted-ca\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-tmp\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-klusterlet-config\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2a6d394-9707-4877-a957-f0de6207a808-tmp-dir\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmpc\" (UniqueName: \"kubernetes.io/projected/a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92-kube-api-access-8wmpc\") pod \"managed-serviceaccount-addon-agent-7b74c9b497-qm9cb\" (UID: \"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25fn\" (UniqueName: \"kubernetes.io/projected/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-kube-api-access-q25fn\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.622526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:02.622909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8smt\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-kube-api-access-l8smt\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfdzx\" (UniqueName: \"kubernetes.io/projected/5dec30cc-6570-4dfb-a0fb-88fbed75b201-kube-api-access-cfdzx\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:02.622909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622638 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-ca-trust-extracted\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-installation-pull-secrets\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.622909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.622730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d58a34cf-9452-41fc-8307-814f7d1cfba4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.723940 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.723863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"
Apr 23 08:51:02.723940 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.723895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nsgr\" (UniqueName: \"kubernetes.io/projected/f2a6d394-9707-4877-a957-f0de6207a808-kube-api-access-5nsgr\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.723940 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.723921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b74c9b497-qm9cb\" (UID: \"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.723941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.723969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2a6d394-9707-4877-a957-f0de6207a808-config-volume\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.723992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-bound-sa-token\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-certificates\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.724121 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.724143 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found
Apr 23 08:51:02.724203 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.724206 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:03.224185431 +0000 UTC m=+32.617088481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found
Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-image-registry-private-configuration\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-trusted-ca\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-tmp\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID:
\"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-klusterlet-config\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2a6d394-9707-4877-a957-f0de6207a808-tmp-dir\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wmpc\" (UniqueName: \"kubernetes.io/projected/a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92-kube-api-access-8wmpc\") pod \"managed-serviceaccount-addon-agent-7b74c9b497-qm9cb\" (UID: \"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q25fn\" (UniqueName: \"kubernetes.io/projected/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-kube-api-access-q25fn\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: 
I0423 08:51:02.724506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:51:02.724550 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8smt\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-kube-api-access-l8smt\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfdzx\" (UniqueName: \"kubernetes.io/projected/5dec30cc-6570-4dfb-a0fb-88fbed75b201-kube-api-access-cfdzx\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-ca-trust-extracted\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " 
pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-certificates\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-installation-pull-secrets\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724671 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2a6d394-9707-4877-a957-f0de6207a808-config-volume\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.724763 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.724778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-tmp\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:02.725050 ip-10-0-136-146 
kubenswrapper[2574]: E0423 08:51:02.724823 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:03.224805079 +0000 UTC m=+32.617708114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found Apr 23 08:51:02.725050 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.725047 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:02.725474 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:02.725105 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:03.225088593 +0000 UTC m=+32.617991637 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found Apr 23 08:51:02.725542 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-trusted-ca\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.725611 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725587 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-ca-trust-extracted\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.725662 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d58a34cf-9452-41fc-8307-814f7d1cfba4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.725729 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-hub\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 
08:51:02.725795 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.725795 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxm5j\" (UniqueName: \"kubernetes.io/projected/d58a34cf-9452-41fc-8307-814f7d1cfba4-kube-api-access-hxm5j\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.725890 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-ca\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.725963 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.725938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2a6d394-9707-4877-a957-f0de6207a808-tmp-dir\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:51:02.727326 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.727249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d58a34cf-9452-41fc-8307-814f7d1cfba4-ocpservice-ca\") 
pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.729806 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.729694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-ca\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.729806 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.729727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-installation-pull-secrets\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.729806 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.729759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b74c9b497-qm9cb\" (UID: \"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" Apr 23 08:51:02.729806 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.729773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.729806 
ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.729763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-klusterlet-config\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:02.730096 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.729970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.730158 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.730139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-image-registry-private-configuration\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.730194 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.730142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d58a34cf-9452-41fc-8307-814f7d1cfba4-hub\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.734039 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.733990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-bound-sa-token\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.734680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.734200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nsgr\" (UniqueName: \"kubernetes.io/projected/f2a6d394-9707-4877-a957-f0de6207a808-kube-api-access-5nsgr\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:51:02.734680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.734237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8smt\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-kube-api-access-l8smt\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:02.734680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.734433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wmpc\" (UniqueName: \"kubernetes.io/projected/a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92-kube-api-access-8wmpc\") pod \"managed-serviceaccount-addon-agent-7b74c9b497-qm9cb\" (UID: \"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" Apr 23 08:51:02.734929 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.734820 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25fn\" (UniqueName: \"kubernetes.io/projected/ec80fcab-79e9-4dd9-8f97-2382cb4ff154-kube-api-access-q25fn\") pod \"klusterlet-addon-workmgr-86bcfb9f44-5l2fh\" (UID: \"ec80fcab-79e9-4dd9-8f97-2382cb4ff154\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:02.735419 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.735397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxm5j\" (UniqueName: \"kubernetes.io/projected/d58a34cf-9452-41fc-8307-814f7d1cfba4-kube-api-access-hxm5j\") pod \"cluster-proxy-proxy-agent-6b55dfcd94-xf8gw\" (UID: \"d58a34cf-9452-41fc-8307-814f7d1cfba4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.735882 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.735865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfdzx\" (UniqueName: \"kubernetes.io/projected/5dec30cc-6570-4dfb-a0fb-88fbed75b201-kube-api-access-cfdzx\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:51:02.821889 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.821844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" Apr 23 08:51:02.822291 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.822272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:51:02.853200 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:02.853088 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:51:03.025940 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.025857 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"] Apr 23 08:51:03.027044 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.027008 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb"] Apr 23 08:51:03.028204 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.028178 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw"] Apr 23 08:51:03.030566 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:51:03.030541 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7670ff1_7d1d_48ee_a89d_6e2d8e7b6a92.slice/crio-9a75c7a5706e867e536954b275c27949e3122b55b8aec907859dfe7e10d51a43 WatchSource:0}: Error finding container 9a75c7a5706e867e536954b275c27949e3122b55b8aec907859dfe7e10d51a43: Status 404 returned error can't find the container with id 9a75c7a5706e867e536954b275c27949e3122b55b8aec907859dfe7e10d51a43 Apr 23 08:51:03.031219 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:51:03.031193 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec80fcab_79e9_4dd9_8f97_2382cb4ff154.slice/crio-1d96fa8985c434a8861998b4be2678c12e90e558a5784a698b7ff92ebe59a055 WatchSource:0}: Error finding container 1d96fa8985c434a8861998b4be2678c12e90e558a5784a698b7ff92ebe59a055: Status 404 returned error can't find the container with id 1d96fa8985c434a8861998b4be2678c12e90e558a5784a698b7ff92ebe59a055 Apr 23 08:51:03.031969 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:51:03.031924 2574 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd58a34cf_9452_41fc_8307_814f7d1cfba4.slice/crio-1b0c3f7695524a9c6df2e4381df2e1d6086b5eff6040573c2db0b98edf0b225f WatchSource:0}: Error finding container 1b0c3f7695524a9c6df2e4381df2e1d6086b5eff6040573c2db0b98edf0b225f: Status 404 returned error can't find the container with id 1b0c3f7695524a9c6df2e4381df2e1d6086b5eff6040573c2db0b98edf0b225f Apr 23 08:51:03.231672 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.231639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.231699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.231745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.231814 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.231880 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:04.231865507 +0000 UTC m=+33.624768536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.231815 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.231893 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.231951 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:04.231933651 +0000 UTC m=+33.624836685 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.231899 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found Apr 23 08:51:03.232124 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.232005 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:04.231995189 +0000 UTC m=+33.624898242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found Apr 23 08:51:03.379645 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.379603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" event={"ID":"ec80fcab-79e9-4dd9-8f97-2382cb4ff154","Type":"ContainerStarted","Data":"1d96fa8985c434a8861998b4be2678c12e90e558a5784a698b7ff92ebe59a055"} Apr 23 08:51:03.380664 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.380634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" event={"ID":"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92","Type":"ContainerStarted","Data":"9a75c7a5706e867e536954b275c27949e3122b55b8aec907859dfe7e10d51a43"} Apr 23 08:51:03.381698 ip-10-0-136-146 kubenswrapper[2574]: I0423 
08:51:03.381674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" event={"ID":"d58a34cf-9452-41fc-8307-814f7d1cfba4","Type":"ContainerStarted","Data":"1b0c3f7695524a9c6df2e4381df2e1d6086b5eff6040573c2db0b98edf0b225f"}
Apr 23 08:51:03.936738 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:03.936685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:51:03.936935 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.936865 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:03.937000 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:03.936944 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:51:35.936924798 +0000 UTC m=+65.329827841 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:04.037269 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.037222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:04.037439 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.037422 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:51:04.037503 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.037447 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:51:04.037503 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.037462 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jdtws for pod openshift-network-diagnostics/network-check-target-5f7d6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:04.037604 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.037527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws podName:44958f50-5d35-4dcd-831d-1140d11671e5 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:36.037508167 +0000 UTC m=+65.430411215 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdtws" (UniqueName: "kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws") pod "network-check-target-5f7d6" (UID: "44958f50-5d35-4dcd-831d-1140d11671e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:04.180613 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.180580 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:51:04.180613 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.180604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:04.180881 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.180580 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:51:04.183356 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.183330 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 08:51:04.183479 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.183375 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:51:04.183479 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.183443 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:51:04.183630 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.183609 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 08:51:04.184387 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.184365 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9mgcs\""
Apr 23 08:51:04.184478 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.184393 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wtr6s\""
Apr 23 08:51:04.238852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.238809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:04.238852 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.238857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:04.238948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.238973 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.239035 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:06.239017118 +0000 UTC m=+35.631920168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.239037 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.239052 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.239084 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:06.239075349 +0000 UTC m=+35.631978394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.239085 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:04.239367 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:04.239133 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:06.239123953 +0000 UTC m=+35.632026998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found
Apr 23 08:51:06.256075 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.255802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:06.256075 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.255841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:06.256075 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.255948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:06.256075 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.255999 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:06.256075 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.256082 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:10.256060832 +0000 UTC m=+39.648963877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:06.256724 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.256118 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:06.256724 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.256174 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:10.256158831 +0000 UTC m=+39.649061876 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found
Apr 23 08:51:06.257493 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.257390 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:51:06.257493 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.257411 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found
Apr 23 08:51:06.257493 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:06.257465 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:10.257449456 +0000 UTC m=+39.650352498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found
Apr 23 08:51:06.395775 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.394753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerStarted","Data":"ac63f597b214c8965a65654777a4de786d8c3be7e6769807ac4607e64badfd90"}
Apr 23 08:51:06.763818 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.763701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:51:06.770165 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.770135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51448c01-b78f-45a0-89ef-99a6d2c0613c-original-pull-secret\") pod \"global-pull-secret-syncer-rm7wx\" (UID: \"51448c01-b78f-45a0-89ef-99a6d2c0613c\") " pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:51:06.904326 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:06.903898 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rm7wx"
Apr 23 08:51:07.078323 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:07.077571 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rm7wx"]
Apr 23 08:51:07.398954 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:07.398886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rm7wx" event={"ID":"51448c01-b78f-45a0-89ef-99a6d2c0613c","Type":"ContainerStarted","Data":"3faf4ecc17a072b6d8f8ea6cc0e6d1c8444a17cb2bbba7f93c6a90e7f1e6099c"}
Apr 23 08:51:07.403246 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:07.403210 2574 generic.go:358] "Generic (PLEG): container finished" podID="21eeab6f-10d8-432b-aeab-0166ad5410c3" containerID="ac63f597b214c8965a65654777a4de786d8c3be7e6769807ac4607e64badfd90" exitCode=0
Apr 23 08:51:07.403406 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:07.403300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerDied","Data":"ac63f597b214c8965a65654777a4de786d8c3be7e6769807ac4607e64badfd90"}
Apr 23 08:51:08.410195 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:08.409645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerStarted","Data":"96b90f74d68473a78487b0b6eda769003047fb864447aa88bf54d80a2a6e95b1"}
Apr 23 08:51:09.415045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:09.414999 2574 generic.go:358] "Generic (PLEG): container finished" podID="21eeab6f-10d8-432b-aeab-0166ad5410c3" containerID="96b90f74d68473a78487b0b6eda769003047fb864447aa88bf54d80a2a6e95b1" exitCode=0
Apr 23 08:51:09.415045 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:09.415045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerDied","Data":"96b90f74d68473a78487b0b6eda769003047fb864447aa88bf54d80a2a6e95b1"}
Apr 23 08:51:10.296680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:10.296629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:10.296894 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:10.296728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:10.296894 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:10.296750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:10.296894 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296811 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:51:10.296894 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296837 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found
Apr 23 08:51:10.296894 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296844 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:10.296894 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296861 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:10.297246 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296903 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:18.296885768 +0000 UTC m=+47.689788799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found
Apr 23 08:51:10.297246 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296921 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:18.296912507 +0000 UTC m=+47.689815541 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:10.297246 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:10.296936 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:18.296928851 +0000 UTC m=+47.689831882 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found
Apr 23 08:51:14.427061 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.427025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" event={"ID":"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92","Type":"ContainerStarted","Data":"9eae1c62b7e8f203715e71b8b7813370e724f3ac4976012bd91f3cf85c2af5b6"}
Apr 23 08:51:14.428415 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.428388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" event={"ID":"ec80fcab-79e9-4dd9-8f97-2382cb4ff154","Type":"ContainerStarted","Data":"9845a3582ea68044bd5b35bc49efd2fcd6f3c7ebe5100100b90b6d6f99683983"}
Apr 23 08:51:14.428601 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.428580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"
Apr 23 08:51:14.429657 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.429625 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" event={"ID":"d58a34cf-9452-41fc-8307-814f7d1cfba4","Type":"ContainerStarted","Data":"6b2db1ddc00aaf393249a8de785525ee99d114613dad1168fbbba1198facff8c"}
Apr 23 08:51:14.430592 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.430573 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh"
Apr 23 08:51:14.431035 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.431014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rm7wx" event={"ID":"51448c01-b78f-45a0-89ef-99a6d2c0613c","Type":"ContainerStarted","Data":"4885fa509a4d6eb6130f5ea1b5882862d8c8249a7f589118e95c32156f0ec84a"}
Apr 23 08:51:14.433627 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.433608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" event={"ID":"21eeab6f-10d8-432b-aeab-0166ad5410c3","Type":"ContainerStarted","Data":"758ed06471b8e627c6a5339d6d56a37e359af10582f17745f5031eed836efc06"}
Apr 23 08:51:14.442457 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.442418 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" podStartSLOduration=14.933759251 podStartE2EDuration="25.442407366s" podCreationTimestamp="2026-04-23 08:50:49 +0000 UTC" firstStartedPulling="2026-04-23 08:51:03.032802612 +0000 UTC m=+32.425705657" lastFinishedPulling="2026-04-23 08:51:13.541450729 +0000 UTC m=+42.934353772" observedRunningTime="2026-04-23 08:51:14.4422313 +0000 UTC m=+43.835134367" watchObservedRunningTime="2026-04-23 08:51:14.442407366 +0000 UTC m=+43.835310411"
Apr 23 08:51:14.456514 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.456463 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rm7wx" podStartSLOduration=33.990024838 podStartE2EDuration="40.456447654s" podCreationTimestamp="2026-04-23 08:50:34 +0000 UTC" firstStartedPulling="2026-04-23 08:51:07.086374117 +0000 UTC m=+36.479277155" lastFinishedPulling="2026-04-23 08:51:13.552796934 +0000 UTC m=+42.945699971" observedRunningTime="2026-04-23 08:51:14.455957495 +0000 UTC m=+43.848860547" watchObservedRunningTime="2026-04-23 08:51:14.456447654 +0000 UTC m=+43.849350705"
Apr 23 08:51:14.470832 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.470775 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" podStartSLOduration=14.962635842 podStartE2EDuration="25.470757786s" podCreationTimestamp="2026-04-23 08:50:49 +0000 UTC" firstStartedPulling="2026-04-23 08:51:03.033334917 +0000 UTC m=+32.426237946" lastFinishedPulling="2026-04-23 08:51:13.541456855 +0000 UTC m=+42.934359890" observedRunningTime="2026-04-23 08:51:14.470517935 +0000 UTC m=+43.863421000" watchObservedRunningTime="2026-04-23 08:51:14.470757786 +0000 UTC m=+43.863660842"
Apr 23 08:51:14.491374 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:14.491313 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7mkzw" podStartSLOduration=9.825441078 podStartE2EDuration="43.491297041s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:50:32.45194083 +0000 UTC m=+1.844843859" lastFinishedPulling="2026-04-23 08:51:06.117796775 +0000 UTC m=+35.510699822" observedRunningTime="2026-04-23 08:51:14.489854662 +0000 UTC m=+43.882757714" watchObservedRunningTime="2026-04-23 08:51:14.491297041 +0000 UTC m=+43.884200094"
Apr 23 08:51:16.439444 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:16.439413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" event={"ID":"d58a34cf-9452-41fc-8307-814f7d1cfba4","Type":"ContainerStarted","Data":"6632e099e29f4996870576dbac322273fee99e049ea96d20c4bffd93921c28f9"}
Apr 23 08:51:17.444173 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:17.444136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" event={"ID":"d58a34cf-9452-41fc-8307-814f7d1cfba4","Type":"ContainerStarted","Data":"37cf13d805e05df307aaa391746e70be8c34d1dbad394d5367430a10a2e49b2b"}
Apr 23 08:51:17.463661 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:17.463609 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" podStartSLOduration=15.192980464 podStartE2EDuration="28.463596325s" podCreationTimestamp="2026-04-23 08:50:49 +0000 UTC" firstStartedPulling="2026-04-23 08:51:03.034112552 +0000 UTC m=+32.427015586" lastFinishedPulling="2026-04-23 08:51:16.304728413 +0000 UTC m=+45.697631447" observedRunningTime="2026-04-23 08:51:17.462019912 +0000 UTC m=+46.854922964" watchObservedRunningTime="2026-04-23 08:51:17.463596325 +0000 UTC m=+46.856499377"
Apr 23 08:51:18.362300 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:18.362260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:18.362300 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:18.362297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:18.362520 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:18.362365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:18.362520 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362422 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:18.362520 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362467 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:18.362520 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362491 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:34.362475404 +0000 UTC m=+63.755378447 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:18.362520 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362513 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:34.362501689 +0000 UTC m=+63.755404725 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found
Apr 23 08:51:18.362520 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362469 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:51:18.362777 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362530 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found
Apr 23 08:51:18.362777 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:18.362558 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:51:34.362552266 +0000 UTC m=+63.755455296 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found
Apr 23 08:51:28.386944 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:28.386912 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qbpw"
Apr 23 08:51:34.382392 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:34.382346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:51:34.382392 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:34.382399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd"
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:34.382419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq"
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382529 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382609 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:06.382592713 +0000 UTC m=+95.775495761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382531 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382639 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382680 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:52:06.382669571 +0000 UTC m=+95.775572616 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382534 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:51:34.382887 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:34.382745 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:06.382702706 +0000 UTC m=+95.775605773 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found
Apr 23 08:51:35.994256 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:35.994197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w"
Apr 23 08:51:35.996886 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:35.996865 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 08:51:36.005103 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:36.005083 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 08:51:36.005165 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:51:36.005156 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a nodeName:}" failed. No retries permitted until 2026-04-23 08:52:40.005138503 +0000 UTC m=+129.398041536 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : secret "metrics-daemon-secret" not found
Apr 23 08:51:36.095118 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.095082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:36.098042 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.098021 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:51:36.108138 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.108114 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:51:36.119548 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.119517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdtws\" (UniqueName: \"kubernetes.io/projected/44958f50-5d35-4dcd-831d-1140d11671e5-kube-api-access-jdtws\") pod \"network-check-target-5f7d6\" (UID: \"44958f50-5d35-4dcd-831d-1140d11671e5\") " pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:36.312927 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.312899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9mgcs\""
Apr 23 08:51:36.321096 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.321069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f7d6"
Apr 23 08:51:36.454222 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.454191 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5f7d6"]
Apr 23 08:51:36.458079 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:51:36.458037 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44958f50_5d35_4dcd_831d_1140d11671e5.slice/crio-7798ec0d9793a3350a4902cb7c4ea1bf5076e24e7584dcd3e6efd4052f3c7142 WatchSource:0}: Error finding container 7798ec0d9793a3350a4902cb7c4ea1bf5076e24e7584dcd3e6efd4052f3c7142: Status 404 returned error can't find the container with id 7798ec0d9793a3350a4902cb7c4ea1bf5076e24e7584dcd3e6efd4052f3c7142
Apr 23 08:51:36.498309 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:36.498277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5f7d6" event={"ID":"44958f50-5d35-4dcd-831d-1140d11671e5","Type":"ContainerStarted","Data":"7798ec0d9793a3350a4902cb7c4ea1bf5076e24e7584dcd3e6efd4052f3c7142"}
Apr 23 08:51:40.509648 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:40.509611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5f7d6" event={"ID":"44958f50-5d35-4dcd-831d-1140d11671e5","Type":"ContainerStarted","Data":"29ed5cd30381b282a1daa297cd96756396611129208f6f8c3a983eb8d1b93ff7"}
Apr 23 08:51:40.510056 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:40.509834 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:51:40.525880 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:51:40.525833 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5f7d6" podStartSLOduration=66.146615377 podStartE2EDuration="1m9.525816764s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:51:36.459796591 +0000 UTC m=+65.852699621" lastFinishedPulling="2026-04-23 08:51:39.838997973 +0000 UTC m=+69.231901008" observedRunningTime="2026-04-23 08:51:40.525305957 +0000 UTC m=+69.918209012" watchObservedRunningTime="2026-04-23 08:51:40.525816764 +0000 UTC m=+69.918719819" Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:52:06.418591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:52:06.418656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:52:06.418682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418770 2574 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418789 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418806 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418830 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418855 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:53:10.41883523 +0000 UTC m=+159.811738271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418872 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert podName:5dec30cc-6570-4dfb-a0fb-88fbed75b201 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:10.418864082 +0000 UTC m=+159.811767112 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert") pod "ingress-canary-t97wq" (UID: "5dec30cc-6570-4dfb-a0fb-88fbed75b201") : secret "canary-serving-cert" not found Apr 23 08:52:06.418939 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:06.418900 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls podName:f2a6d394-9707-4877-a957-f0de6207a808 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:10.418883402 +0000 UTC m=+159.811786446 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls") pod "dns-default-cb4jd" (UID: "f2a6d394-9707-4877-a957-f0de6207a808") : secret "dns-default-metrics-tls" not found Apr 23 08:52:11.514172 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:52:11.514139 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5f7d6" Apr 23 08:52:40.068500 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:52:40.068457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:52:40.069021 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:40.068569 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:52:40.069021 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:52:40.068623 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs podName:47f515b8-3d0e-4a31-898f-c3738e20428a 
nodeName:}" failed. No retries permitted until 2026-04-23 08:54:42.068607771 +0000 UTC m=+251.461510802 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs") pod "network-metrics-daemon-hfh7w" (UID: "47f515b8-3d0e-4a31-898f-c3738e20428a") : secret "metrics-daemon-secret" not found Apr 23 08:53:05.323124 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:05.323094 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-th9nj_c65f288c-6f59-486d-a1fb-454d54bcb237/dns-node-resolver/0.log" Apr 23 08:53:05.521289 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:05.521247 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" podUID="7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" Apr 23 08:53:05.559116 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:05.559077 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cb4jd" podUID="f2a6d394-9707-4877-a957-f0de6207a808" Apr 23 08:53:05.565199 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:05.565169 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t97wq" podUID="5dec30cc-6570-4dfb-a0fb-88fbed75b201" Apr 23 08:53:05.704626 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:05.704532 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cb4jd" Apr 23 08:53:05.704829 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:05.704533 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:53:06.323067 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:06.323039 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kk9bd_d8ffaf53-084c-43ec-9bde-51d70f29f38b/node-ca/0.log" Apr 23 08:53:07.195175 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:07.195134 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hfh7w" podUID="47f515b8-3d0e-4a31-898f-c3738e20428a" Apr 23 08:53:10.487240 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.487192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:53:10.487240 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.487237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:53:10.487697 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.487287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") pod \"image-registry-6fd46678c7-bjsjt\" (UID: 
\"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") " pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" Apr 23 08:53:10.487697 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:10.487374 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:53:10.487697 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:10.487385 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd46678c7-bjsjt: secret "image-registry-tls" not found Apr 23 08:53:10.487697 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:10.487436 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls podName:7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c nodeName:}" failed. No retries permitted until 2026-04-23 08:55:12.48742199 +0000 UTC m=+281.880325034 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls") pod "image-registry-6fd46678c7-bjsjt" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c") : secret "image-registry-tls" not found Apr 23 08:53:10.489551 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.489527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2a6d394-9707-4877-a957-f0de6207a808-metrics-tls\") pod \"dns-default-cb4jd\" (UID: \"f2a6d394-9707-4877-a957-f0de6207a808\") " pod="openshift-dns/dns-default-cb4jd" Apr 23 08:53:10.489675 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.489657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dec30cc-6570-4dfb-a0fb-88fbed75b201-cert\") pod \"ingress-canary-t97wq\" (UID: \"5dec30cc-6570-4dfb-a0fb-88fbed75b201\") " pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 
23 08:53:10.507868 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.507835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cjh2d\"" Apr 23 08:53:10.515556 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.515539 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cb4jd" Apr 23 08:53:10.632396 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.632362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cb4jd"] Apr 23 08:53:10.635832 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:53:10.635798 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a6d394_9707_4877_a957_f0de6207a808.slice/crio-29d56e4d3663e25115ba8a3c3d3451ebd6aa5736e3d1db496c21935f14dc63f4 WatchSource:0}: Error finding container 29d56e4d3663e25115ba8a3c3d3451ebd6aa5736e3d1db496c21935f14dc63f4: Status 404 returned error can't find the container with id 29d56e4d3663e25115ba8a3c3d3451ebd6aa5736e3d1db496c21935f14dc63f4 Apr 23 08:53:10.717639 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:10.717606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cb4jd" event={"ID":"f2a6d394-9707-4877-a957-f0de6207a808","Type":"ContainerStarted","Data":"29d56e4d3663e25115ba8a3c3d3451ebd6aa5736e3d1db496c21935f14dc63f4"} Apr 23 08:53:12.723662 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:12.723624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cb4jd" event={"ID":"f2a6d394-9707-4877-a957-f0de6207a808","Type":"ContainerStarted","Data":"2406bb041c1b7faab70fc2f27ad37ddc85e2d89810ae401a58b3fbfc2af04daf"} Apr 23 08:53:12.723662 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:12.723664 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cb4jd" 
event={"ID":"f2a6d394-9707-4877-a957-f0de6207a808","Type":"ContainerStarted","Data":"9bb14383696536ebe768ae1e37bc436e7b285c1623eef7d8f3927d52f876682d"} Apr 23 08:53:12.724160 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:12.723766 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cb4jd" Apr 23 08:53:12.743256 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:12.743201 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cb4jd" podStartSLOduration=129.232678343 podStartE2EDuration="2m10.743187759s" podCreationTimestamp="2026-04-23 08:51:02 +0000 UTC" firstStartedPulling="2026-04-23 08:53:10.637632555 +0000 UTC m=+160.030535585" lastFinishedPulling="2026-04-23 08:53:12.148141971 +0000 UTC m=+161.541045001" observedRunningTime="2026-04-23 08:53:12.742082072 +0000 UTC m=+162.134985124" watchObservedRunningTime="2026-04-23 08:53:12.743187759 +0000 UTC m=+162.136090811" Apr 23 08:53:14.429151 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.429087 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" podUID="ec80fcab-79e9-4dd9-8f97-2382cb4ff154" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 23 08:53:14.731250 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.731164 2574 generic.go:358] "Generic (PLEG): container finished" podID="a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92" containerID="9eae1c62b7e8f203715e71b8b7813370e724f3ac4976012bd91f3cf85c2af5b6" exitCode=255 Apr 23 08:53:14.731250 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.731240 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" 
event={"ID":"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92","Type":"ContainerDied","Data":"9eae1c62b7e8f203715e71b8b7813370e724f3ac4976012bd91f3cf85c2af5b6"} Apr 23 08:53:14.731592 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.731577 2574 scope.go:117] "RemoveContainer" containerID="9eae1c62b7e8f203715e71b8b7813370e724f3ac4976012bd91f3cf85c2af5b6" Apr 23 08:53:14.732624 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.732604 2574 generic.go:358] "Generic (PLEG): container finished" podID="ec80fcab-79e9-4dd9-8f97-2382cb4ff154" containerID="9845a3582ea68044bd5b35bc49efd2fcd6f3c7ebe5100100b90b6d6f99683983" exitCode=1 Apr 23 08:53:14.732741 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.732643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" event={"ID":"ec80fcab-79e9-4dd9-8f97-2382cb4ff154","Type":"ContainerDied","Data":"9845a3582ea68044bd5b35bc49efd2fcd6f3c7ebe5100100b90b6d6f99683983"} Apr 23 08:53:14.732935 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:14.732918 2574 scope.go:117] "RemoveContainer" containerID="9845a3582ea68044bd5b35bc49efd2fcd6f3c7ebe5100100b90b6d6f99683983" Apr 23 08:53:15.736639 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:15.736602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b74c9b497-qm9cb" event={"ID":"a7670ff1-7d1d-48ee-a89d-6e2d8e7b6a92","Type":"ContainerStarted","Data":"df00bbb6aad0321d051889558f2efb6f5b4ba3e0b7abe2e8ffa670c63ea65888"} Apr 23 08:53:15.738038 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:15.738008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" event={"ID":"ec80fcab-79e9-4dd9-8f97-2382cb4ff154","Type":"ContainerStarted","Data":"ba0c0c1f30ada581e66261fc2e15679d97a2b7ec4faefb105a0662b0c1ccf5f3"} Apr 23 08:53:15.738290 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:53:15.738274 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:53:15.738871 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:15.738845 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86bcfb9f44-5l2fh" Apr 23 08:53:18.180504 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:18.180466 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:53:19.180911 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:19.180876 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:53:19.183800 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:19.183778 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nrkt9\"" Apr 23 08:53:19.191765 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:19.191749 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t97wq" Apr 23 08:53:19.307271 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:19.307238 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t97wq"] Apr 23 08:53:19.311256 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:53:19.311228 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dec30cc_6570_4dfb_a0fb_88fbed75b201.slice/crio-47fa38321bc51bac3429bed7f8635f2cbccfb3c9d93863d974b975a18df939aa WatchSource:0}: Error finding container 47fa38321bc51bac3429bed7f8635f2cbccfb3c9d93863d974b975a18df939aa: Status 404 returned error can't find the container with id 47fa38321bc51bac3429bed7f8635f2cbccfb3c9d93863d974b975a18df939aa Apr 23 08:53:19.750112 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:19.750075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t97wq" event={"ID":"5dec30cc-6570-4dfb-a0fb-88fbed75b201","Type":"ContainerStarted","Data":"47fa38321bc51bac3429bed7f8635f2cbccfb3c9d93863d974b975a18df939aa"} Apr 23 08:53:21.756637 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:21.756597 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t97wq" event={"ID":"5dec30cc-6570-4dfb-a0fb-88fbed75b201","Type":"ContainerStarted","Data":"929cc73d2e6083d6a6ab011daba35561258e3d0e2629918f0eb9824054ae0c14"} Apr 23 08:53:21.772275 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:21.772222 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t97wq" podStartSLOduration=137.893170793 podStartE2EDuration="2m19.772208738s" podCreationTimestamp="2026-04-23 08:51:02 +0000 UTC" firstStartedPulling="2026-04-23 08:53:19.313241586 +0000 UTC m=+168.706144616" lastFinishedPulling="2026-04-23 08:53:21.192279527 +0000 UTC m=+170.585182561" 
observedRunningTime="2026-04-23 08:53:21.771036925 +0000 UTC m=+171.163939990" watchObservedRunningTime="2026-04-23 08:53:21.772208738 +0000 UTC m=+171.165111790" Apr 23 08:53:22.729679 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:22.729650 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cb4jd" Apr 23 08:53:25.534382 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.534355 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kq47d"] Apr 23 08:53:25.537432 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.537411 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kq47d" Apr 23 08:53:25.540099 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.540072 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:53:25.540214 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.540107 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vx78w\"" Apr 23 08:53:25.541005 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.540985 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:53:25.541215 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.541202 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:53:25.541321 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.541306 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:53:25.548332 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.548310 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-kq47d"] Apr 23 08:53:25.605477 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.605447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cebef459-245a-4d33-8f89-bed32461fc84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d" Apr 23 08:53:25.605621 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.605482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrhhq\" (UniqueName: \"kubernetes.io/projected/cebef459-245a-4d33-8f89-bed32461fc84-kube-api-access-xrhhq\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d" Apr 23 08:53:25.605621 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.605506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cebef459-245a-4d33-8f89-bed32461fc84-data-volume\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d" Apr 23 08:53:25.605621 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.605526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cebef459-245a-4d33-8f89-bed32461fc84-crio-socket\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d" Apr 23 08:53:25.605740 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.605626 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cebef459-245a-4d33-8f89-bed32461fc84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.706777 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.706745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cebef459-245a-4d33-8f89-bed32461fc84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.706941 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.706784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrhhq\" (UniqueName: \"kubernetes.io/projected/cebef459-245a-4d33-8f89-bed32461fc84-kube-api-access-xrhhq\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.706941 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.706806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cebef459-245a-4d33-8f89-bed32461fc84-data-volume\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.706941 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.706825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cebef459-245a-4d33-8f89-bed32461fc84-crio-socket\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.706941 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.706862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cebef459-245a-4d33-8f89-bed32461fc84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.707153 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.706980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cebef459-245a-4d33-8f89-bed32461fc84-crio-socket\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.707208 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.707196 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cebef459-245a-4d33-8f89-bed32461fc84-data-volume\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.707419 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.707397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cebef459-245a-4d33-8f89-bed32461fc84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.709083 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.709064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cebef459-245a-4d33-8f89-bed32461fc84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.716114 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.716094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrhhq\" (UniqueName: \"kubernetes.io/projected/cebef459-245a-4d33-8f89-bed32461fc84-kube-api-access-xrhhq\") pod \"insights-runtime-extractor-kq47d\" (UID: \"cebef459-245a-4d33-8f89-bed32461fc84\") " pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.847038 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.846942 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kq47d"
Apr 23 08:53:25.965099 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:25.965064 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kq47d"]
Apr 23 08:53:25.968551 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:53:25.968523 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcebef459_245a_4d33_8f89_bed32461fc84.slice/crio-5f9315cdc55d36e366a3385c51ddc9445eeeef239d52cfd6c603a8e408052b6c WatchSource:0}: Error finding container 5f9315cdc55d36e366a3385c51ddc9445eeeef239d52cfd6c603a8e408052b6c: Status 404 returned error can't find the container with id 5f9315cdc55d36e366a3385c51ddc9445eeeef239d52cfd6c603a8e408052b6c
Apr 23 08:53:26.769549 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:26.769511 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kq47d" event={"ID":"cebef459-245a-4d33-8f89-bed32461fc84","Type":"ContainerStarted","Data":"58aadf4cd5bf92d6d60fb836d0264cbc2d4f69dd345cc15809572f9ec326f9e1"}
Apr 23 08:53:26.769549 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:26.769550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kq47d" event={"ID":"cebef459-245a-4d33-8f89-bed32461fc84","Type":"ContainerStarted","Data":"5f9315cdc55d36e366a3385c51ddc9445eeeef239d52cfd6c603a8e408052b6c"}
Apr 23 08:53:27.773541 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:27.773501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kq47d" event={"ID":"cebef459-245a-4d33-8f89-bed32461fc84","Type":"ContainerStarted","Data":"620f86f80969a7ea4f855f13251ba6e9e49fd3976e078e39e80dc2cd6c3f2b9a"}
Apr 23 08:53:29.779614 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:29.779579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kq47d" event={"ID":"cebef459-245a-4d33-8f89-bed32461fc84","Type":"ContainerStarted","Data":"a0725d97930ccd3de1af05566721d0d6c7e5d84298b95610abb061cabcfbc2c2"}
Apr 23 08:53:29.797902 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:29.797682 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kq47d" podStartSLOduration=1.755964793 podStartE2EDuration="4.797665248s" podCreationTimestamp="2026-04-23 08:53:25 +0000 UTC" firstStartedPulling="2026-04-23 08:53:26.024821184 +0000 UTC m=+175.417724228" lastFinishedPulling="2026-04-23 08:53:29.06652164 +0000 UTC m=+178.459424683" observedRunningTime="2026-04-23 08:53:29.797352518 +0000 UTC m=+179.190255571" watchObservedRunningTime="2026-04-23 08:53:29.797665248 +0000 UTC m=+179.190568301"
Apr 23 08:53:32.857501 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.857472 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dbr7j"]
Apr 23 08:53:32.860883 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.860859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.863622 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.863600 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 08:53:32.863970 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.863942 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 08:53:32.863970 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.863954 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 08:53:32.864189 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.864174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 08:53:32.864868 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.864853 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 08:53:32.864968 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.864889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 08:53:32.865024 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.864991 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x268v\""
Apr 23 08:53:32.962450 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-textfile\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962450 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-tls\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-accelerators-collector-config\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-sys\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b6q\" (UniqueName: \"kubernetes.io/projected/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-kube-api-access-m7b6q\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962762 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-wtmp\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962762 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962671 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-root\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962762 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962726 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:32.962762 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:32.962759 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-metrics-client-ca\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.063724 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-metrics-client-ca\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.063855 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-textfile\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.063855 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-tls\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.063855 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-accelerators-collector-config\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.063855 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-sys\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064022 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b6q\" (UniqueName: \"kubernetes.io/projected/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-kube-api-access-m7b6q\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064022 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-wtmp\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064022 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-root\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064022 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-sys\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064022 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.063933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064265 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.064019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-root\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064265 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:33.064044 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 08:53:33.064265 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:33.064122 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-tls podName:ddca292b-0e11-4d75-8ec7-bb0d2bbad00c nodeName:}" failed. No retries permitted until 2026-04-23 08:53:33.564102539 +0000 UTC m=+182.957005589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-tls") pod "node-exporter-dbr7j" (UID: "ddca292b-0e11-4d75-8ec7-bb0d2bbad00c") : secret "node-exporter-tls" not found
Apr 23 08:53:33.064265 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.064200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-wtmp\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064265 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.064201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-textfile\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064442 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.064363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-accelerators-collector-config\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.064442 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.064397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-metrics-client-ca\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.066168 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.066148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.072528 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.072499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7b6q\" (UniqueName: \"kubernetes.io/projected/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-kube-api-access-m7b6q\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.567986 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.567932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-tls\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.570220 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.570189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ddca292b-0e11-4d75-8ec7-bb0d2bbad00c-node-exporter-tls\") pod \"node-exporter-dbr7j\" (UID: \"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c\") " pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.769824 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.769782 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dbr7j"
Apr 23 08:53:33.777485 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:53:33.777453 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddca292b_0e11_4d75_8ec7_bb0d2bbad00c.slice/crio-f61094eaabf729254b3ecf59490a748a50c83bf91efa95940b197e1200c67883 WatchSource:0}: Error finding container f61094eaabf729254b3ecf59490a748a50c83bf91efa95940b197e1200c67883: Status 404 returned error can't find the container with id f61094eaabf729254b3ecf59490a748a50c83bf91efa95940b197e1200c67883
Apr 23 08:53:33.790229 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:33.790197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dbr7j" event={"ID":"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c","Type":"ContainerStarted","Data":"f61094eaabf729254b3ecf59490a748a50c83bf91efa95940b197e1200c67883"}
Apr 23 08:53:34.794140 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:34.794110 2574 generic.go:358] "Generic (PLEG): container finished" podID="ddca292b-0e11-4d75-8ec7-bb0d2bbad00c" containerID="17cc7bf7f48485607eb08bb6608cd51047d040219f36b89b7b783703eb0b085e" exitCode=0
Apr 23 08:53:34.794508 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:34.794175 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dbr7j" event={"ID":"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c","Type":"ContainerDied","Data":"17cc7bf7f48485607eb08bb6608cd51047d040219f36b89b7b783703eb0b085e"}
Apr 23 08:53:35.799095 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:35.799055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dbr7j" event={"ID":"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c","Type":"ContainerStarted","Data":"1dd2217e602705545c36ebfdb62f218d819fdfafdde6adcee7a691a165ed774d"}
Apr 23 08:53:35.799095 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:35.799095 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dbr7j" event={"ID":"ddca292b-0e11-4d75-8ec7-bb0d2bbad00c","Type":"ContainerStarted","Data":"18170d3a82847a0295ace854bd15636139f10600f6b17ffd1414354408eb04ca"}
Apr 23 08:53:35.818144 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:35.818094 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dbr7j" podStartSLOduration=3.15205012 podStartE2EDuration="3.818078094s" podCreationTimestamp="2026-04-23 08:53:32 +0000 UTC" firstStartedPulling="2026-04-23 08:53:33.779260616 +0000 UTC m=+183.172163650" lastFinishedPulling="2026-04-23 08:53:34.445288576 +0000 UTC m=+183.838191624" observedRunningTime="2026-04-23 08:53:35.816948853 +0000 UTC m=+185.209851906" watchObservedRunningTime="2026-04-23 08:53:35.818078094 +0000 UTC m=+185.210981145"
Apr 23 08:53:48.091843 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.091807 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6fd46678c7-bjsjt"]
Apr 23 08:53:48.092354 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:53:48.092007 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt" podUID="7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"
Apr 23 08:53:48.830000 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.829969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:53:48.834170 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.834144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:53:48.992770 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992741 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8smt\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-kube-api-access-l8smt\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.992943 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992799 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-image-registry-private-configuration\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.992943 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992828 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-bound-sa-token\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.992943 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992860 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-trusted-ca\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.992943 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992889 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-certificates\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.992943 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992927 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-ca-trust-extracted\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.993196 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.992981 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-installation-pull-secrets\") pod \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\" (UID: \"7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c\") "
Apr 23 08:53:48.993261 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.993235 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:53:48.993316 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.993291 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:53:48.993316 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.993302 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:53:48.995220 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.995192 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:53:48.995354 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.995262 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-kube-api-access-l8smt" (OuterVolumeSpecName: "kube-api-access-l8smt") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "kube-api-access-l8smt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:53:48.995523 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.995498 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:53:48.995560 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:48.995515 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" (UID: "7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094337 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-installation-pull-secrets\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094382 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8smt\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-kube-api-access-l8smt\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094395 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-image-registry-private-configuration\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094406 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-bound-sa-token\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094419 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-trusted-ca\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094428 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-certificates\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.094443 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.094437 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-ca-trust-extracted\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:49.832655 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.832622 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fd46678c7-bjsjt"
Apr 23 08:53:49.863496 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.863463 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6fd46678c7-bjsjt"]
Apr 23 08:53:49.866340 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:49.866313 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6fd46678c7-bjsjt"]
Apr 23 08:53:50.002324 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:50.002293 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c-registry-tls\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\""
Apr 23 08:53:51.184490 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:51.184457 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c" path="/var/lib/kubelet/pods/7d7fa430-3bae-49e5-9f53-c66f7c0c4c1c/volumes"
Apr 23 08:53:52.823560 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:52.823519 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" podUID="d58a34cf-9452-41fc-8307-814f7d1cfba4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 08:53:58.466185 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.466145 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f7c5675cb-4fkp7"]
Apr 23 08:53:58.471777 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.471752 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f7c5675cb-4fkp7"
Apr 23 08:53:58.474623 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.474596 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:53:58.474777 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.474601 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:53:58.474777 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.474608 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:53:58.474777 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.474609 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 08:53:58.474777 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.474630 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:53:58.475685 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.475663 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 08:53:58.475796 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.475759 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:53:58.475855 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.475820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gz86g\""
Apr 23 08:53:58.478420 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.478397 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f7c5675cb-4fkp7"]
Apr 23 08:53:58.569553 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.569519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-serving-cert\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7"
Apr 23 08:53:58.569553 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.569555 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-oauth-config\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7"
Apr 23 08:53:58.569803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.569616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-oauth-serving-cert\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7"
Apr 23 08:53:58.569803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.569645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-service-ca\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7"
Apr 23 08:53:58.569803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.569731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-config\") pod \"console-6f7c5675cb-4fkp7\" (UID:
\"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.569803 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.569765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6b5\" (UniqueName: \"kubernetes.io/projected/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-kube-api-access-4w6b5\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.670235 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.670188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-oauth-serving-cert\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.670235 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.670245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-service-ca\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.670418 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.670279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-config\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.670418 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.670295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w6b5\" (UniqueName: 
\"kubernetes.io/projected/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-kube-api-access-4w6b5\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.670418 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.670314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-serving-cert\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.670418 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.670330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-oauth-config\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.671536 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.671506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-oauth-serving-cert\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.671674 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.671506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-config\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.671674 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.671593 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-service-ca\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.672746 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.672726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-oauth-config\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.672916 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.672898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-serving-cert\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.678618 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.678589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6b5\" (UniqueName: \"kubernetes.io/projected/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-kube-api-access-4w6b5\") pod \"console-6f7c5675cb-4fkp7\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.781180 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.781147 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:53:58.901849 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:58.901817 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f7c5675cb-4fkp7"] Apr 23 08:53:58.904783 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:53:58.904756 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a9c8faa_5a03_43a1_9049_e90ffe7d8cc9.slice/crio-c5ca7a48ba67d39f0d1aed2d94db0cf2720519292eae9898fb47ce062125f948 WatchSource:0}: Error finding container c5ca7a48ba67d39f0d1aed2d94db0cf2720519292eae9898fb47ce062125f948: Status 404 returned error can't find the container with id c5ca7a48ba67d39f0d1aed2d94db0cf2720519292eae9898fb47ce062125f948 Apr 23 08:53:59.858151 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:53:59.858116 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f7c5675cb-4fkp7" event={"ID":"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9","Type":"ContainerStarted","Data":"c5ca7a48ba67d39f0d1aed2d94db0cf2720519292eae9898fb47ce062125f948"} Apr 23 08:54:01.868422 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:01.868375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f7c5675cb-4fkp7" event={"ID":"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9","Type":"ContainerStarted","Data":"51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d"} Apr 23 08:54:01.886366 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:01.886291 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f7c5675cb-4fkp7" podStartSLOduration=1.108368589 podStartE2EDuration="3.886275841s" podCreationTimestamp="2026-04-23 08:53:58 +0000 UTC" firstStartedPulling="2026-04-23 08:53:58.906484084 +0000 UTC m=+208.299387114" lastFinishedPulling="2026-04-23 08:54:01.684391337 +0000 UTC m=+211.077294366" 
observedRunningTime="2026-04-23 08:54:01.885105199 +0000 UTC m=+211.278008248" watchObservedRunningTime="2026-04-23 08:54:01.886275841 +0000 UTC m=+211.279178893" Apr 23 08:54:02.823397 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:02.823358 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" podUID="d58a34cf-9452-41fc-8307-814f7d1cfba4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:54:08.781680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:08.781636 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:54:08.781680 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:08.781690 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:54:08.786368 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:08.786346 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:54:08.889808 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:08.889775 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:54:12.823465 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:12.823429 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" podUID="d58a34cf-9452-41fc-8307-814f7d1cfba4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:54:12.823867 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:12.823493 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" Apr 23 08:54:12.824001 
ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:12.823968 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"37cf13d805e05df307aaa391746e70be8c34d1dbad394d5367430a10a2e49b2b"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 08:54:12.824048 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:12.824033 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" podUID="d58a34cf-9452-41fc-8307-814f7d1cfba4" containerName="service-proxy" containerID="cri-o://37cf13d805e05df307aaa391746e70be8c34d1dbad394d5367430a10a2e49b2b" gracePeriod=30 Apr 23 08:54:13.899384 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:13.899349 2574 generic.go:358] "Generic (PLEG): container finished" podID="d58a34cf-9452-41fc-8307-814f7d1cfba4" containerID="37cf13d805e05df307aaa391746e70be8c34d1dbad394d5367430a10a2e49b2b" exitCode=2 Apr 23 08:54:13.899903 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:13.899410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" event={"ID":"d58a34cf-9452-41fc-8307-814f7d1cfba4","Type":"ContainerDied","Data":"37cf13d805e05df307aaa391746e70be8c34d1dbad394d5367430a10a2e49b2b"} Apr 23 08:54:13.899903 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:13.899443 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b55dfcd94-xf8gw" event={"ID":"d58a34cf-9452-41fc-8307-814f7d1cfba4","Type":"ContainerStarted","Data":"247c2fae5af9b9e8cdc94c81d6517701d6415681d8d607027ab4dbe79fcde288"} Apr 23 08:54:19.278574 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:19.278537 2574 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-6f7c5675cb-4fkp7"] Apr 23 08:54:42.077475 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:42.077427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:54:42.079857 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:42.079833 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f515b8-3d0e-4a31-898f-c3738e20428a-metrics-certs\") pod \"network-metrics-daemon-hfh7w\" (UID: \"47f515b8-3d0e-4a31-898f-c3738e20428a\") " pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:54:42.184190 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:42.184156 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wtr6s\"" Apr 23 08:54:42.192154 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:42.192130 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hfh7w" Apr 23 08:54:42.309549 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:42.309515 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hfh7w"] Apr 23 08:54:42.313208 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:54:42.313161 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47f515b8_3d0e_4a31_898f_c3738e20428a.slice/crio-13a041c4b42debe02f198b3676917fe78e8ba45b8877c013715166ce684fdafa WatchSource:0}: Error finding container 13a041c4b42debe02f198b3676917fe78e8ba45b8877c013715166ce684fdafa: Status 404 returned error can't find the container with id 13a041c4b42debe02f198b3676917fe78e8ba45b8877c013715166ce684fdafa Apr 23 08:54:42.975641 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:42.975602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hfh7w" event={"ID":"47f515b8-3d0e-4a31-898f-c3738e20428a","Type":"ContainerStarted","Data":"13a041c4b42debe02f198b3676917fe78e8ba45b8877c013715166ce684fdafa"} Apr 23 08:54:44.297460 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.297420 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f7c5675cb-4fkp7" podUID="4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" containerName="console" containerID="cri-o://51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d" gracePeriod=15 Apr 23 08:54:44.568409 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.568382 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f7c5675cb-4fkp7_4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9/console/0.log" Apr 23 08:54:44.568539 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.568447 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:54:44.696129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696097 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-service-ca\") pod \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " Apr 23 08:54:44.696129 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696136 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-serving-cert\") pod \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " Apr 23 08:54:44.696383 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696167 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-oauth-serving-cert\") pod \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " Apr 23 08:54:44.696383 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696193 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w6b5\" (UniqueName: \"kubernetes.io/projected/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-kube-api-access-4w6b5\") pod \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " Apr 23 08:54:44.696383 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696220 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-oauth-config\") pod \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " Apr 23 08:54:44.696383 
ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696253 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-config\") pod \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\" (UID: \"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9\") " Apr 23 08:54:44.696586 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696474 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" (UID: "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:54:44.696641 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" (UID: "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:54:44.696751 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.696698 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-config" (OuterVolumeSpecName: "console-config") pod "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" (UID: "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:54:44.698386 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.698363 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" (UID: "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:54:44.698473 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.698425 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" (UID: "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:54:44.698544 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.698525 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-kube-api-access-4w6b5" (OuterVolumeSpecName: "kube-api-access-4w6b5") pod "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" (UID: "4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9"). InnerVolumeSpecName "kube-api-access-4w6b5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:54:44.797079 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.797043 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-config\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:54:44.797079 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.797075 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-service-ca\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:54:44.797079 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.797084 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-serving-cert\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:54:44.797289 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.797094 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-oauth-serving-cert\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:54:44.797289 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.797103 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4w6b5\" (UniqueName: \"kubernetes.io/projected/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-kube-api-access-4w6b5\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:54:44.797289 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.797112 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9-console-oauth-config\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:54:44.983096 ip-10-0-136-146 
kubenswrapper[2574]: I0423 08:54:44.983052 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hfh7w" event={"ID":"47f515b8-3d0e-4a31-898f-c3738e20428a","Type":"ContainerStarted","Data":"c3639430ddf909d896f6d5ac6143ba00edb7d6a3d51a29b2aeaf0a3a0d9300b1"} Apr 23 08:54:44.983240 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.983104 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hfh7w" event={"ID":"47f515b8-3d0e-4a31-898f-c3738e20428a","Type":"ContainerStarted","Data":"816cf44eb9c282b535b7e61eb55e87ab3852778267a49e93f7ddb46a6a302491"} Apr 23 08:54:44.984141 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.984124 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f7c5675cb-4fkp7_4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9/console/0.log" Apr 23 08:54:44.984219 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.984160 2574 generic.go:358] "Generic (PLEG): container finished" podID="4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" containerID="51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d" exitCode=2 Apr 23 08:54:44.984219 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.984208 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f7c5675cb-4fkp7" event={"ID":"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9","Type":"ContainerDied","Data":"51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d"} Apr 23 08:54:44.984294 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.984211 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f7c5675cb-4fkp7" Apr 23 08:54:44.984294 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.984237 2574 scope.go:117] "RemoveContainer" containerID="51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d" Apr 23 08:54:44.984364 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.984228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f7c5675cb-4fkp7" event={"ID":"4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9","Type":"ContainerDied","Data":"c5ca7a48ba67d39f0d1aed2d94db0cf2720519292eae9898fb47ce062125f948"} Apr 23 08:54:44.991694 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.991643 2574 scope.go:117] "RemoveContainer" containerID="51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d" Apr 23 08:54:44.991963 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:54:44.991940 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d\": container with ID starting with 51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d not found: ID does not exist" containerID="51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d" Apr 23 08:54:44.992027 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.991975 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d"} err="failed to get container status \"51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d\": rpc error: code = NotFound desc = could not find container \"51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d\": container with ID starting with 51fba92c7850ae831dd044f22a246b5a66d496292c43f4cef2a89c6a9c1e6b2d not found: ID does not exist" Apr 23 08:54:44.999473 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:44.999430 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hfh7w" podStartSLOduration=252.096953453 podStartE2EDuration="4m13.999418031s" podCreationTimestamp="2026-04-23 08:50:31 +0000 UTC" firstStartedPulling="2026-04-23 08:54:42.315074063 +0000 UTC m=+251.707977093" lastFinishedPulling="2026-04-23 08:54:44.217538638 +0000 UTC m=+253.610441671" observedRunningTime="2026-04-23 08:54:44.998739701 +0000 UTC m=+254.391642753" watchObservedRunningTime="2026-04-23 08:54:44.999418031 +0000 UTC m=+254.392321100" Apr 23 08:54:45.014652 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:45.014622 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f7c5675cb-4fkp7"] Apr 23 08:54:45.019691 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:45.019655 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f7c5675cb-4fkp7"] Apr 23 08:54:45.185135 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:45.185055 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" path="/var/lib/kubelet/pods/4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9/volumes" Apr 23 08:54:49.819053 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.819021 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9ccf59cb6-rxtbx"] Apr 23 08:54:49.819529 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.819253 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" containerName="console" Apr 23 08:54:49.819529 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.819264 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" containerName="console" Apr 23 08:54:49.819529 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.819307 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4a9c8faa-5a03-43a1-9049-e90ffe7d8cc9" containerName="console"
Apr 23 08:54:49.822018 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.821997 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.824537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.824506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 08:54:49.824537 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.824534 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:54:49.825675 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.825653 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:54:49.825877 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.825858 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 08:54:49.825980 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.825916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:54:49.826056 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.825986 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:54:49.826056 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.826001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:54:49.826267 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.826252 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gz86g\""
Apr 23 08:54:49.830613 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.830595 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 08:54:49.831259 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.831238 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9ccf59cb6-rxtbx"]
Apr 23 08:54:49.934526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-config\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.934526 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-service-ca\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.934780 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-oauth-config\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.934780 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-serving-cert\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.934780 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-oauth-serving-cert\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.934780 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-trusted-ca-bundle\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:49.934780 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:49.934675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rtdt\" (UniqueName: \"kubernetes.io/projected/144a4983-0d7c-4849-8bd5-a67aa534fce7-kube-api-access-7rtdt\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.035697 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.035653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-oauth-serving-cert\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.035697 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.035704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-trusted-ca-bundle\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.035977 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.035761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rtdt\" (UniqueName: \"kubernetes.io/projected/144a4983-0d7c-4849-8bd5-a67aa534fce7-kube-api-access-7rtdt\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.035977 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.035810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-config\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.035977 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.035943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-service-ca\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.036136 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.035999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-oauth-config\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.036136 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.036031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-serving-cert\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.036506 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.036481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-oauth-serving-cert\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.036590 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.036486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-config\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.036640 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.036624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-service-ca\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.036675 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.036657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-trusted-ca-bundle\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.038470 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.038447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-oauth-config\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.038674 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.038654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-serving-cert\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.044694 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.044667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rtdt\" (UniqueName: \"kubernetes.io/projected/144a4983-0d7c-4849-8bd5-a67aa534fce7-kube-api-access-7rtdt\") pod \"console-9ccf59cb6-rxtbx\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.132436 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.132349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:54:50.255196 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:50.255126 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9ccf59cb6-rxtbx"]
Apr 23 08:54:50.258412 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:54:50.258381 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod144a4983_0d7c_4849_8bd5_a67aa534fce7.slice/crio-112bc752f5218b1f71496e70bcf600a2e378856f6eb3b45915840042196303c7 WatchSource:0}: Error finding container 112bc752f5218b1f71496e70bcf600a2e378856f6eb3b45915840042196303c7: Status 404 returned error can't find the container with id 112bc752f5218b1f71496e70bcf600a2e378856f6eb3b45915840042196303c7
Apr 23 08:54:51.002726 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:51.002671 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ccf59cb6-rxtbx" event={"ID":"144a4983-0d7c-4849-8bd5-a67aa534fce7","Type":"ContainerStarted","Data":"56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644"}
Apr 23 08:54:51.003127 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:51.002735 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ccf59cb6-rxtbx" event={"ID":"144a4983-0d7c-4849-8bd5-a67aa534fce7","Type":"ContainerStarted","Data":"112bc752f5218b1f71496e70bcf600a2e378856f6eb3b45915840042196303c7"}
Apr 23 08:54:51.020166 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:54:51.020113 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9ccf59cb6-rxtbx" podStartSLOduration=2.020098695 podStartE2EDuration="2.020098695s" podCreationTimestamp="2026-04-23 08:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:54:51.019183514 +0000 UTC m=+260.412086570" watchObservedRunningTime="2026-04-23 08:54:51.020098695 +0000 UTC m=+260.413001747"
Apr 23 08:55:00.132728 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:00.132667 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:55:00.132728 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:00.132737 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:55:00.137631 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:00.137606 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:55:01.033962 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:01.033931 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9ccf59cb6-rxtbx"
Apr 23 08:55:31.093017 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:31.092991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log"
Apr 23 08:55:31.093017 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:31.093003 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log"
Apr 23 08:55:31.101648 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:55:31.101623 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 08:56:46.405808 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.405771 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"]
Apr 23 08:56:46.408672 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.408654 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.412527 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.412504 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:56:46.412673 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.412525 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 23 08:56:46.412673 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.412543 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 23 08:56:46.412673 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.412577 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 23 08:56:46.412673 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.412529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 23 08:56:46.412673 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.412653 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-p2pcf\""
Apr 23 08:56:46.417147 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.417123 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"]
Apr 23 08:56:46.466967 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.466924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f84cd894-7c03-43c3-af6d-373643b55d2f-metrics-certs\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.467162 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.466984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj4ct\" (UniqueName: \"kubernetes.io/projected/f84cd894-7c03-43c3-af6d-373643b55d2f-kube-api-access-vj4ct\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.467162 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.467017 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84cd894-7c03-43c3-af6d-373643b55d2f-cert\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.467162 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.467051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f84cd894-7c03-43c3-af6d-373643b55d2f-manager-config\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.567752 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.567720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f84cd894-7c03-43c3-af6d-373643b55d2f-metrics-certs\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.567921 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.567776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj4ct\" (UniqueName: \"kubernetes.io/projected/f84cd894-7c03-43c3-af6d-373643b55d2f-kube-api-access-vj4ct\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.567921 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.567804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84cd894-7c03-43c3-af6d-373643b55d2f-cert\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.567921 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.567833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f84cd894-7c03-43c3-af6d-373643b55d2f-manager-config\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.568507 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.568474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f84cd894-7c03-43c3-af6d-373643b55d2f-manager-config\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.570177 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.570149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f84cd894-7c03-43c3-af6d-373643b55d2f-metrics-certs\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.570479 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.570459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f84cd894-7c03-43c3-af6d-373643b55d2f-cert\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.575856 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.575833 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj4ct\" (UniqueName: \"kubernetes.io/projected/f84cd894-7c03-43c3-af6d-373643b55d2f-kube-api-access-vj4ct\") pod \"jobset-controller-manager-578d897558-87xwt\" (UID: \"f84cd894-7c03-43c3-af6d-373643b55d2f\") " pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.719297 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.719203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:46.836962 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.836924 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"]
Apr 23 08:56:46.839802 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:56:46.839772 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84cd894_7c03_43c3_af6d_373643b55d2f.slice/crio-a42e0e499699624718f01c2785f3ac549d100e852683565698fbb8f51ac0ab7a WatchSource:0}: Error finding container a42e0e499699624718f01c2785f3ac549d100e852683565698fbb8f51ac0ab7a: Status 404 returned error can't find the container with id a42e0e499699624718f01c2785f3ac549d100e852683565698fbb8f51ac0ab7a
Apr 23 08:56:46.841649 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:46.841634 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:56:47.302440 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:47.302402 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt" event={"ID":"f84cd894-7c03-43c3-af6d-373643b55d2f","Type":"ContainerStarted","Data":"a42e0e499699624718f01c2785f3ac549d100e852683565698fbb8f51ac0ab7a"}
Apr 23 08:56:49.308848 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:49.308798 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt" event={"ID":"f84cd894-7c03-43c3-af6d-373643b55d2f","Type":"ContainerStarted","Data":"f7305a593a0a410d90afaf2ccaba630bd1962e241a4b90a887845fa22a9e06ee"}
Apr 23 08:56:49.309226 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:49.308941 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:56:49.324809 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:56:49.324757 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt" podStartSLOduration=1.717406441 podStartE2EDuration="3.32474205s" podCreationTimestamp="2026-04-23 08:56:46 +0000 UTC" firstStartedPulling="2026-04-23 08:56:46.841779463 +0000 UTC m=+376.234682496" lastFinishedPulling="2026-04-23 08:56:48.449115061 +0000 UTC m=+377.842018105" observedRunningTime="2026-04-23 08:56:49.323992949 +0000 UTC m=+378.716896001" watchObservedRunningTime="2026-04-23 08:56:49.32474205 +0000 UTC m=+378.717645102"
Apr 23 08:57:00.316477 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:57:00.316449 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-578d897558-87xwt"
Apr 23 08:59:11.616393 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.616360 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-584c75f8d6-dj85q"]
Apr 23 08:59:11.619289 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.619271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.630169 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.630142 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-584c75f8d6-dj85q"]
Apr 23 08:59:11.686800 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.686769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-oauth-config\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.686800 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.686800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-trusted-ca-bundle\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.687013 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.686822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-serving-cert\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.687013 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.686917 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-config\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.687013 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.686970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb5zk\" (UniqueName: \"kubernetes.io/projected/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-kube-api-access-bb5zk\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.687013 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.686988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-oauth-serving-cert\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.687013 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.687005 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-service-ca\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.787757 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.787701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb5zk\" (UniqueName: \"kubernetes.io/projected/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-kube-api-access-bb5zk\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.787757 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.787761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-oauth-serving-cert\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.787973 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.787788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-service-ca\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.787973 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.787820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-oauth-config\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.787973 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.787842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-trusted-ca-bundle\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.787973 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.787868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-serving-cert\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.788176 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.788019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-config\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.788597 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.788571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-oauth-serving-cert\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.788749 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.788609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-service-ca\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.788749 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.788684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-config\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.788885 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.788828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-trusted-ca-bundle\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.790317 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.790288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-serving-cert\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.790317 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.790299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-console-oauth-config\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.795660 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.795638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb5zk\" (UniqueName: \"kubernetes.io/projected/022947a1-dd0a-45d4-8a9c-c12f6d7fbd41-kube-api-access-bb5zk\") pod \"console-584c75f8d6-dj85q\" (UID: \"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41\") " pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:11.928874 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:11.928759 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:12.046824 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:12.046793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-584c75f8d6-dj85q"]
Apr 23 08:59:12.050579 ip-10-0-136-146 kubenswrapper[2574]: W0423 08:59:12.050546 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022947a1_dd0a_45d4_8a9c_c12f6d7fbd41.slice/crio-fe4f348cafd66472602bd17b547736c4aeeef30e1de7a51fba77a9fdb2104415 WatchSource:0}: Error finding container fe4f348cafd66472602bd17b547736c4aeeef30e1de7a51fba77a9fdb2104415: Status 404 returned error can't find the container with id fe4f348cafd66472602bd17b547736c4aeeef30e1de7a51fba77a9fdb2104415
Apr 23 08:59:12.682988 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:12.682948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-584c75f8d6-dj85q" event={"ID":"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41","Type":"ContainerStarted","Data":"fa1bc9252fb079c682f9c440d887d19eb5eaa5a3b310b81b2eae742efca7c11b"}
Apr 23 08:59:12.682988 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:12.682992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-584c75f8d6-dj85q" event={"ID":"022947a1-dd0a-45d4-8a9c-c12f6d7fbd41","Type":"ContainerStarted","Data":"fe4f348cafd66472602bd17b547736c4aeeef30e1de7a51fba77a9fdb2104415"}
Apr 23 08:59:12.700936 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:12.700884 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-584c75f8d6-dj85q" podStartSLOduration=1.7008703139999999 podStartE2EDuration="1.700870314s" podCreationTimestamp="2026-04-23 08:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:59:12.699638948 +0000 UTC m=+522.092542002" watchObservedRunningTime="2026-04-23 08:59:12.700870314 +0000 UTC m=+522.093773365"
Apr 23 08:59:21.929929 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:21.929825 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:21.929929 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:21.929910 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:21.934515 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:21.934495 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:22.713491 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:22.713466 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-584c75f8d6-dj85q"
Apr 23 08:59:22.763097 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:22.763062 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9ccf59cb6-rxtbx"]
Apr 23 08:59:47.782148 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:47.782107 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9ccf59cb6-rxtbx" podUID="144a4983-0d7c-4849-8bd5-a67aa534fce7" containerName="console" containerID="cri-o://56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644" gracePeriod=15
Apr 23 08:59:48.013500 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.013474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9ccf59cb6-rxtbx_144a4983-0d7c-4849-8bd5-a67aa534fce7/console/0.log"
Apr 23 08:59:48.013622 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.013535 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9ccf59cb6-rxtbx" Apr 23 08:59:48.164333 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164239 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rtdt\" (UniqueName: \"kubernetes.io/projected/144a4983-0d7c-4849-8bd5-a67aa534fce7-kube-api-access-7rtdt\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164333 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-oauth-config\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164333 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164327 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-service-ca\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164611 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164343 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-oauth-serving-cert\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164611 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164378 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-trusted-ca-bundle\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164611 
ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164403 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-serving-cert\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164611 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164430 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-config\") pod \"144a4983-0d7c-4849-8bd5-a67aa534fce7\" (UID: \"144a4983-0d7c-4849-8bd5-a67aa534fce7\") " Apr 23 08:59:48.164909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164862 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:59:48.164909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164879 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-service-ca" (OuterVolumeSpecName: "service-ca") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:59:48.164909 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164887 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:59:48.165104 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.164912 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-config" (OuterVolumeSpecName: "console-config") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:59:48.166519 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.166492 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:59:48.166620 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.166514 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:59:48.166620 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.166498 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144a4983-0d7c-4849-8bd5-a67aa534fce7-kube-api-access-7rtdt" (OuterVolumeSpecName: "kube-api-access-7rtdt") pod "144a4983-0d7c-4849-8bd5-a67aa534fce7" (UID: "144a4983-0d7c-4849-8bd5-a67aa534fce7"). InnerVolumeSpecName "kube-api-access-7rtdt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:59:48.265204 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265165 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-trusted-ca-bundle\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.265204 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265198 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-serving-cert\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.265204 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265210 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-config\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.265445 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265220 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rtdt\" (UniqueName: \"kubernetes.io/projected/144a4983-0d7c-4849-8bd5-a67aa534fce7-kube-api-access-7rtdt\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.265445 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265229 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/144a4983-0d7c-4849-8bd5-a67aa534fce7-console-oauth-config\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.265445 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265263 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-service-ca\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.265445 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.265272 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/144a4983-0d7c-4849-8bd5-a67aa534fce7-oauth-serving-cert\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 08:59:48.782460 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.782433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9ccf59cb6-rxtbx_144a4983-0d7c-4849-8bd5-a67aa534fce7/console/0.log" Apr 23 08:59:48.782920 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.782473 2574 generic.go:358] "Generic (PLEG): container finished" podID="144a4983-0d7c-4849-8bd5-a67aa534fce7" containerID="56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644" exitCode=2 Apr 23 08:59:48.782920 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.782544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ccf59cb6-rxtbx" event={"ID":"144a4983-0d7c-4849-8bd5-a67aa534fce7","Type":"ContainerDied","Data":"56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644"} Apr 23 08:59:48.782920 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.782560 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9ccf59cb6-rxtbx" Apr 23 08:59:48.782920 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.782571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ccf59cb6-rxtbx" event={"ID":"144a4983-0d7c-4849-8bd5-a67aa534fce7","Type":"ContainerDied","Data":"112bc752f5218b1f71496e70bcf600a2e378856f6eb3b45915840042196303c7"} Apr 23 08:59:48.782920 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.782586 2574 scope.go:117] "RemoveContainer" containerID="56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644" Apr 23 08:59:48.791208 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.791184 2574 scope.go:117] "RemoveContainer" containerID="56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644" Apr 23 08:59:48.791466 ip-10-0-136-146 kubenswrapper[2574]: E0423 08:59:48.791446 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644\": container with ID starting with 56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644 not found: ID does not exist" containerID="56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644" Apr 23 08:59:48.791510 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.791474 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644"} err="failed to get container status \"56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644\": rpc error: code = NotFound desc = could not find container \"56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644\": container with ID starting with 56e5eed8a5c801a910569d62e0c24343ebe2e4bbf272b0c27257215a739c3644 not found: ID does not exist" Apr 23 08:59:48.803128 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.803105 2574 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9ccf59cb6-rxtbx"] Apr 23 08:59:48.806494 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:48.806471 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9ccf59cb6-rxtbx"] Apr 23 08:59:49.184323 ip-10-0-136-146 kubenswrapper[2574]: I0423 08:59:49.184240 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144a4983-0d7c-4849-8bd5-a67aa534fce7" path="/var/lib/kubelet/pods/144a4983-0d7c-4849-8bd5-a67aa534fce7/volumes" Apr 23 09:00:31.116126 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:00:31.116097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log" Apr 23 09:00:31.116586 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:00:31.116536 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log" Apr 23 09:02:00.350719 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.350681 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g"] Apr 23 09:02:00.353155 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.350931 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144a4983-0d7c-4849-8bd5-a67aa534fce7" containerName="console" Apr 23 09:02:00.353155 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.350941 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="144a4983-0d7c-4849-8bd5-a67aa534fce7" containerName="console" Apr 23 09:02:00.353155 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.350983 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="144a4983-0d7c-4849-8bd5-a67aa534fce7" containerName="console" Apr 23 09:02:00.354040 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.354024 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:02:00.356664 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.356640 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"openshift-service-ca.crt\"" Apr 23 09:02:00.356831 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.356640 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"default-dockercfg-42m4p\"" Apr 23 09:02:00.356831 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.356680 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"kube-root-ca.crt\"" Apr 23 09:02:00.361637 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.361613 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g"] Apr 23 09:02:00.390482 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.390441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6js7m\" (UniqueName: \"kubernetes.io/projected/2b55e9e1-1597-4702-8e2b-f99983494d35-kube-api-access-6js7m\") pod \"progression-job-failure-node-0-0-f662g\" (UID: \"2b55e9e1-1597-4702-8e2b-f99983494d35\") " pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:02:00.491833 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.491785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6js7m\" (UniqueName: \"kubernetes.io/projected/2b55e9e1-1597-4702-8e2b-f99983494d35-kube-api-access-6js7m\") pod \"progression-job-failure-node-0-0-f662g\" (UID: \"2b55e9e1-1597-4702-8e2b-f99983494d35\") " pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:02:00.499806 ip-10-0-136-146 
kubenswrapper[2574]: I0423 09:02:00.499772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6js7m\" (UniqueName: \"kubernetes.io/projected/2b55e9e1-1597-4702-8e2b-f99983494d35-kube-api-access-6js7m\") pod \"progression-job-failure-node-0-0-f662g\" (UID: \"2b55e9e1-1597-4702-8e2b-f99983494d35\") " pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:02:00.663905 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.663805 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:02:00.784135 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.784041 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g"] Apr 23 09:02:00.786948 ip-10-0-136-146 kubenswrapper[2574]: W0423 09:02:00.786919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b55e9e1_1597_4702_8e2b_f99983494d35.slice/crio-3aa6eb8423f4397f73af1673739b67b8d342ef4f9347cca5a28a573cf519f9e9 WatchSource:0}: Error finding container 3aa6eb8423f4397f73af1673739b67b8d342ef4f9347cca5a28a573cf519f9e9: Status 404 returned error can't find the container with id 3aa6eb8423f4397f73af1673739b67b8d342ef4f9347cca5a28a573cf519f9e9 Apr 23 09:02:00.788791 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:00.788777 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:02:01.127723 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:02:01.127670 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" event={"ID":"2b55e9e1-1597-4702-8e2b-f99983494d35","Type":"ContainerStarted","Data":"3aa6eb8423f4397f73af1673739b67b8d342ef4f9347cca5a28a573cf519f9e9"} Apr 23 09:03:47.428435 ip-10-0-136-146 
kubenswrapper[2574]: I0423 09:03:47.428393 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" event={"ID":"2b55e9e1-1597-4702-8e2b-f99983494d35","Type":"ContainerStarted","Data":"cbb0eafa4f3db8e3a37f1c01a5c14d3c7022294754629df7f25753cd92986b70"} Apr 23 09:03:47.428930 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:47.428499 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:03:47.446558 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:47.446502 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" podStartSLOduration=1.502976963 podStartE2EDuration="1m47.44648681s" podCreationTimestamp="2026-04-23 09:02:00 +0000 UTC" firstStartedPulling="2026-04-23 09:02:00.788899239 +0000 UTC m=+690.181802279" lastFinishedPulling="2026-04-23 09:03:46.732409097 +0000 UTC m=+796.125312126" observedRunningTime="2026-04-23 09:03:47.445197633 +0000 UTC m=+796.838100698" watchObservedRunningTime="2026-04-23 09:03:47.44648681 +0000 UTC m=+796.839389935" Apr 23 09:03:48.431423 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:48.431391 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:03:56.432067 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:56.432023 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" podUID="2b55e9e1-1597-4702-8e2b-f99983494d35" containerName="node" probeResult="failure" output="Get \"http://10.132.0.17:28080/metrics\": dial tcp 10.132.0.17:28080: connect: connection refused" Apr 23 09:03:56.454840 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:56.454807 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="2b55e9e1-1597-4702-8e2b-f99983494d35" containerID="cbb0eafa4f3db8e3a37f1c01a5c14d3c7022294754629df7f25753cd92986b70" exitCode=1 Apr 23 09:03:56.454996 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:56.454884 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" event={"ID":"2b55e9e1-1597-4702-8e2b-f99983494d35","Type":"ContainerDied","Data":"cbb0eafa4f3db8e3a37f1c01a5c14d3c7022294754629df7f25753cd92986b70"} Apr 23 09:03:57.582214 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:57.582186 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:03:57.636593 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:57.636547 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6js7m\" (UniqueName: \"kubernetes.io/projected/2b55e9e1-1597-4702-8e2b-f99983494d35-kube-api-access-6js7m\") pod \"2b55e9e1-1597-4702-8e2b-f99983494d35\" (UID: \"2b55e9e1-1597-4702-8e2b-f99983494d35\") " Apr 23 09:03:57.638736 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:57.638685 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b55e9e1-1597-4702-8e2b-f99983494d35-kube-api-access-6js7m" (OuterVolumeSpecName: "kube-api-access-6js7m") pod "2b55e9e1-1597-4702-8e2b-f99983494d35" (UID: "2b55e9e1-1597-4702-8e2b-f99983494d35"). InnerVolumeSpecName "kube-api-access-6js7m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:03:57.737191 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:57.737098 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6js7m\" (UniqueName: \"kubernetes.io/projected/2b55e9e1-1597-4702-8e2b-f99983494d35-kube-api-access-6js7m\") on node \"ip-10-0-136-146.ec2.internal\" DevicePath \"\"" Apr 23 09:03:58.462388 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:58.462343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" event={"ID":"2b55e9e1-1597-4702-8e2b-f99983494d35","Type":"ContainerDied","Data":"3aa6eb8423f4397f73af1673739b67b8d342ef4f9347cca5a28a573cf519f9e9"} Apr 23 09:03:58.462388 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:58.462378 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aa6eb8423f4397f73af1673739b67b8d342ef4f9347cca5a28a573cf519f9e9" Apr 23 09:03:58.462388 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:03:58.462380 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g" Apr 23 09:04:17.432459 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:04:17.432421 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g"] Apr 23 09:04:17.435270 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:04:17.435241 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-job-failure-node-0-0-f662g"] Apr 23 09:04:19.183919 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:04:19.183866 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b55e9e1-1597-4702-8e2b-f99983494d35" path="/var/lib/kubelet/pods/2b55e9e1-1597-4702-8e2b-f99983494d35/volumes" Apr 23 09:05:08.252076 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:08.252036 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rm7wx_51448c01-b78f-45a0-89ef-99a6d2c0613c/global-pull-secret-syncer/0.log" Apr 23 09:05:08.258731 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:08.258690 2574 ???:1] "http: TLS handshake error from 10.0.141.250:51910: EOF" Apr 23 09:05:08.359318 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:08.359282 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qw8wp_280929f6-fbfd-40eb-8b83-01a18c96fa3f/konnectivity-agent/0.log" Apr 23 09:05:08.377045 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:08.377013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-146.ec2.internal_5e2a8f2ccfc105868594a8460dd5ad37/haproxy/0.log" Apr 23 09:05:11.806616 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:11.806568 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dbr7j_ddca292b-0e11-4d75-8ec7-bb0d2bbad00c/node-exporter/0.log" Apr 23 09:05:11.822910 ip-10-0-136-146 
kubenswrapper[2574]: I0423 09:05:11.822881 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dbr7j_ddca292b-0e11-4d75-8ec7-bb0d2bbad00c/kube-rbac-proxy/0.log" Apr 23 09:05:11.841267 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:11.841243 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dbr7j_ddca292b-0e11-4d75-8ec7-bb0d2bbad00c/init-textfile/0.log" Apr 23 09:05:14.333385 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.333352 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-584c75f8d6-dj85q_022947a1-dd0a-45d4-8a9c-c12f6d7fbd41/console/0.log" Apr 23 09:05:14.987255 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.987210 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"] Apr 23 09:05:14.987462 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.987450 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b55e9e1-1597-4702-8e2b-f99983494d35" containerName="node" Apr 23 09:05:14.987508 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.987464 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b55e9e1-1597-4702-8e2b-f99983494d35" containerName="node" Apr 23 09:05:14.987542 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.987514 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b55e9e1-1597-4702-8e2b-f99983494d35" containerName="node" Apr 23 09:05:14.990274 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.990253 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:14.992924 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.992900 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7xsfl\"/\"kube-root-ca.crt\""
Apr 23 09:05:14.993037 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.992966 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7xsfl\"/\"openshift-service-ca.crt\""
Apr 23 09:05:14.993972 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.993959 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7xsfl\"/\"default-dockercfg-x5hz6\""
Apr 23 09:05:14.997856 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:14.997836 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"]
Apr 23 09:05:15.106083 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.106046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-lib-modules\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.106083 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.106086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-podres\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.106292 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.106104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-sys\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.106292 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.106120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-proc\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.106292 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.106199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6nmv\" (UniqueName: \"kubernetes.io/projected/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-kube-api-access-d6nmv\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207542 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-lib-modules\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207542 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-podres\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-sys\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-proc\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6nmv\" (UniqueName: \"kubernetes.io/projected/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-kube-api-access-d6nmv\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-proc\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-podres\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-lib-modules\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.207832 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.207670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-sys\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.215361 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.215332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6nmv\" (UniqueName: \"kubernetes.io/projected/47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2-kube-api-access-d6nmv\") pod \"perf-node-gather-daemonset-5bggs\" (UID: \"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.320647 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.320618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cb4jd_f2a6d394-9707-4877-a957-f0de6207a808/dns/0.log"
Apr 23 09:05:15.321585 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.321565 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.336173 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.336141 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cb4jd_f2a6d394-9707-4877-a957-f0de6207a808/kube-rbac-proxy/0.log"
Apr 23 09:05:15.439155 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.439123 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"]
Apr 23 09:05:15.442184 ip-10-0-136-146 kubenswrapper[2574]: W0423 09:05:15.442157 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod47012c4b_a2e9_4e4b_8bd5_ad74650bd4e2.slice/crio-a60d8af8be2693a4148d0066cad5803bddfa4e72bb207d92a81da6fa28d5d902 WatchSource:0}: Error finding container a60d8af8be2693a4148d0066cad5803bddfa4e72bb207d92a81da6fa28d5d902: Status 404 returned error can't find the container with id a60d8af8be2693a4148d0066cad5803bddfa4e72bb207d92a81da6fa28d5d902
Apr 23 09:05:15.481902 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.481876 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-th9nj_c65f288c-6f59-486d-a1fb-454d54bcb237/dns-node-resolver/0.log"
Apr 23 09:05:15.663466 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.663371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs" event={"ID":"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2","Type":"ContainerStarted","Data":"60a825165959fdb2a473f2ec95b2d92c4f5c82a25f408b01caecba174f077d36"}
Apr 23 09:05:15.663466 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.663407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs" event={"ID":"47012c4b-a2e9-4e4b-8bd5-ad74650bd4e2","Type":"ContainerStarted","Data":"a60d8af8be2693a4148d0066cad5803bddfa4e72bb207d92a81da6fa28d5d902"}
Apr 23 09:05:15.663700 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.663516 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:15.680525 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.680473 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs" podStartSLOduration=1.680460957 podStartE2EDuration="1.680460957s" podCreationTimestamp="2026-04-23 09:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:05:15.679106744 +0000 UTC m=+885.072009800" watchObservedRunningTime="2026-04-23 09:05:15.680460957 +0000 UTC m=+885.073364009"
Apr 23 09:05:15.922151 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:15.922062 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kk9bd_d8ffaf53-084c-43ec-9bde-51d70f29f38b/node-ca/0.log"
Apr 23 09:05:16.898620 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:16.898590 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t97wq_5dec30cc-6570-4dfb-a0fb-88fbed75b201/serve-healthcheck-canary/0.log"
Apr 23 09:05:17.391286 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:17.391256 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kq47d_cebef459-245a-4d33-8f89-bed32461fc84/kube-rbac-proxy/0.log"
Apr 23 09:05:17.405435 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:17.405401 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kq47d_cebef459-245a-4d33-8f89-bed32461fc84/exporter/0.log"
Apr 23 09:05:17.420603 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:17.420575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kq47d_cebef459-245a-4d33-8f89-bed32461fc84/extractor/0.log"
Apr 23 09:05:18.953429 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:18.953395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-578d897558-87xwt_f84cd894-7c03-43c3-af6d-373643b55d2f/manager/0.log"
Apr 23 09:05:21.675321 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:21.675250 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-5bggs"
Apr 23 09:05:23.220270 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.220240 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6sq69_33b4ebcd-fa1d-434d-b23c-b9216777b5a2/kube-multus/0.log"
Apr 23 09:05:23.244015 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.243985 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/kube-multus-additional-cni-plugins/0.log"
Apr 23 09:05:23.264865 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.264841 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/egress-router-binary-copy/0.log"
Apr 23 09:05:23.282652 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.282613 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/cni-plugins/0.log"
Apr 23 09:05:23.297685 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.297659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/bond-cni-plugin/0.log"
Apr 23 09:05:23.314465 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.314438 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/routeoverride-cni/0.log"
Apr 23 09:05:23.330918 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.330850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/whereabouts-cni-bincopy/0.log"
Apr 23 09:05:23.349318 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.349295 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7mkzw_21eeab6f-10d8-432b-aeab-0166ad5410c3/whereabouts-cni/0.log"
Apr 23 09:05:23.757976 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.757893 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hfh7w_47f515b8-3d0e-4a31-898f-c3738e20428a/network-metrics-daemon/0.log"
Apr 23 09:05:23.771861 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:23.771830 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hfh7w_47f515b8-3d0e-4a31-898f-c3738e20428a/kube-rbac-proxy/0.log"
Apr 23 09:05:24.536296 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.536262 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-controller/0.log"
Apr 23 09:05:24.549908 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.549881 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/0.log"
Apr 23 09:05:24.558447 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.558419 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovn-acl-logging/1.log"
Apr 23 09:05:24.585668 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.585633 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/kube-rbac-proxy-node/0.log"
Apr 23 09:05:24.603920 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.603891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 09:05:24.616528 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.616483 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/northd/0.log"
Apr 23 09:05:24.630505 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.630475 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/nbdb/0.log"
Apr 23 09:05:24.648426 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.648397 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/sbdb/0.log"
Apr 23 09:05:24.796597 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:24.796510 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2qbpw_6dbd9100-2dd7-4450-a0e7-2f86e96b3487/ovnkube-controller/0.log"
Apr 23 09:05:26.346734 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:26.346683 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5f7d6_44958f50-5d35-4dcd-831d-1140d11671e5/network-check-target-container/0.log"
Apr 23 09:05:27.225677 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:27.225649 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dtwjh_8134e095-58a2-4e24-a2fb-a39cfb902acf/iptables-alerter/0.log"
Apr 23 09:05:27.833747 ip-10-0-136-146 kubenswrapper[2574]: I0423 09:05:27.833701 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4469g_23d44ac8-ae42-4654-8139-0c9ae73fb124/tuned/0.log"