Apr 24 21:14:07.834573 ip-10-0-132-219 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:14:07.834588 ip-10-0-132-219 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:14:07.834599 ip-10-0-132-219 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:14:07.835036 ip-10-0-132-219 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:14:17.865636 ip-10-0-132-219 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:14:17.865648 ip-10-0-132-219 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8670b1b1a6264313bd5c4a053e0b2383 --
Apr 24 21:16:21.793299 ip-10-0-132-219 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:22.308995 ip-10-0-132-219 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:22.308995 ip-10-0-132-219 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:22.308995 ip-10-0-132-219 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:22.308995 ip-10-0-132-219 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:22.308995 ip-10-0-132-219 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:22.310813 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.310717 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:22.317550 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317522 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.317550 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317547 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.317550 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317551 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.317550 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317554 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317558 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317561 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317563 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317567 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317572 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317576 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317584 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317587 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317590 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317593 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317595 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317598 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317602 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317604 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317607 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317609 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317612 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317615 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.317692 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317617 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317620 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317623 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317626 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317629 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317631 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317634 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317636 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317639 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317641 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317644 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317646 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317648 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317651 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317653 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317657 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317659 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317662 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317665 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317667 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.318168 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317671 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317673 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317676 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317678 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317681 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317684 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317688 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317692 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317695 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317700 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317703 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317707 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317710 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317713 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317715 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317718 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317720 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317723 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317725 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317728 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.318646 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317731 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317733 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317736 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317738 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317741 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317743 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317746 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317749 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317753 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317758 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317761 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317764 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317766 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317771 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317773 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317778 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317782 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317786 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317789 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317791 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.319151 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317794 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317797 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317800 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.317804 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318268 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318273 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318276 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318279 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318282 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318284 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318289 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318293 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318296 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318299 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318302 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318305 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318308 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318311 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318314 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.319642 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318317 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318320 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318323 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318326 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318328 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318331 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318334 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318337 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318340 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318342 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318345 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318347 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318350 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318352 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318355 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318357 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318360 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318362 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318365 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318368 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.320155 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318371 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318374 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318376 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318379 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318382 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318384 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318387 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318389 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318392 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318395 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318397 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318400 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318403 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318405 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318408 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318411 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318413 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318416 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318418 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318428 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.320667 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318431 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318433 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318436 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318439 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318442 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318444 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318447 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318450 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318452 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318455 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318457 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318460 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318462 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318464 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318467 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318469 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318472 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318475 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318477 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318480 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.321221 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318482 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318485 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318488 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318491 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318495 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318497 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318500 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318503 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318505 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318508 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.318511 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318598 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318608 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318616 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318623 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318632 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318638 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318644 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318649 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318653 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318656 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:22.321709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318660 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318664 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318667 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318670 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318673 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318676 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318679 2578 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318682 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318685 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318691 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318694 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318698 2578 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318700 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318704 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318708 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318712 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318715 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318718 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318722 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318725 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318728 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318731 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318735 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318739 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318743 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:22.322234 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318746 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318749 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318752 2578 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318755 2578 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318759 2578 flags.go:64] FLAG: --event-burst="100" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318763 2578 flags.go:64] FLAG: --event-qps="50" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318766 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318769 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318772 2578 flags.go:64] FLAG: --eviction-hard="" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318776 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318779 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318782 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318785 2578 flags.go:64] FLAG: --eviction-soft="" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318788 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318792 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318796 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318799 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318802 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:16:22.322879 ip-10-0-132-219 
kubenswrapper[2578]: I0424 21:16:22.318806 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318809 2578 flags.go:64] FLAG: --feature-gates="" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318813 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318816 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318820 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318823 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318826 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:16:22.322879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318829 2578 flags.go:64] FLAG: --help="false" Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318832 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-132-219.ec2.internal" Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318835 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318839 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318842 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318845 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318849 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318853 2578 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318856 2578 flags.go:64] FLAG: --image-service-endpoint=""
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318860 2578 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318863 2578 flags.go:64] FLAG: --kube-api-burst="100"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318866 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318869 2578 flags.go:64] FLAG: --kube-api-qps="50"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318872 2578 flags.go:64] FLAG: --kube-reserved=""
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318875 2578 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318878 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318881 2578 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318884 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318887 2578 flags.go:64] FLAG: --lock-file=""
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318890 2578 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318893 2578 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318896 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318902 2578 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 24 21:16:22.323508 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318905 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318908 2578 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318912 2578 flags.go:64] FLAG: --logging-format="text"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318915 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318918 2578 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318934 2578 flags.go:64] FLAG: --manifest-url=""
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318937 2578 flags.go:64] FLAG: --manifest-url-header=""
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318942 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318945 2578 flags.go:64] FLAG: --max-open-files="1000000"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318950 2578 flags.go:64] FLAG: --max-pods="110"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318953 2578 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318956 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318959 2578 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318962 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318965 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318972 2578 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318976 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318989 2578 flags.go:64] FLAG: --node-status-max-images="50"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318992 2578 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318996 2578 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.318999 2578 flags.go:64] FLAG: --pod-cidr=""
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319002 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319008 2578 flags.go:64] FLAG: --pod-manifest-path=""
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319011 2578 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 24 21:16:22.324131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319015 2578 flags.go:64] FLAG: --pods-per-core="0"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319018 2578 flags.go:64] FLAG: --port="10250"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319021 2578 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319024 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-080c4b465145c91eb"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319027 2578 flags.go:64] FLAG: --qos-reserved=""
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319030 2578 flags.go:64] FLAG: --read-only-port="10255"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319033 2578 flags.go:64] FLAG: --register-node="true"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319037 2578 flags.go:64] FLAG: --register-schedulable="true"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319040 2578 flags.go:64] FLAG: --register-with-taints=""
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319043 2578 flags.go:64] FLAG: --registry-burst="10"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319046 2578 flags.go:64] FLAG: --registry-qps="5"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319049 2578 flags.go:64] FLAG: --reserved-cpus=""
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319053 2578 flags.go:64] FLAG: --reserved-memory=""
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319057 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319060 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319063 2578 flags.go:64] FLAG: --rotate-certificates="false"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319066 2578 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319069 2578 flags.go:64] FLAG: --runonce="false"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319072 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319075 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319078 2578 flags.go:64] FLAG: --seccomp-default="false"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319081 2578 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319084 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319089 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319092 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319095 2578 flags.go:64] FLAG: --storage-driver-password="root"
Apr 24 21:16:22.324710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319099 2578 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319102 2578 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319105 2578 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319108 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319111 2578 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319114 2578 flags.go:64] FLAG: --system-cgroups=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319117 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319123 2578 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319126 2578 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319129 2578 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319134 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319136 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319140 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319146 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319149 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319153 2578 flags.go:64] FLAG: --v="2"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319157 2578 flags.go:64] FLAG: --version="false"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319162 2578 flags.go:64] FLAG: --vmodule=""
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319167 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.319170 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319279 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319283 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319286 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319289 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:22.325371 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319292 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319294 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319297 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319299 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319302 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319306 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319309 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319311 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319313 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319317 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319320 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319322 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319325 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319328 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319330 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319333 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319336 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319338 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319341 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319343 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:22.325978 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319346 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319352 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319356 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319359 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319362 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319365 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319367 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319370 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319373 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319375 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319378 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319381 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319384 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319386 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319389 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319392 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319394 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319398 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319401 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319403 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:22.326489 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319406 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319409 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319413 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319415 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319418 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319421 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319423 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319426 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319428 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319431 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319433 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319436 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319438 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319442 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319445 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319449 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319453 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319455 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319458 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.327002 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319461 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319463 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319466 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319469 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319471 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319474 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319476 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319479 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319482 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319484 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319488 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319491 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319493 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319496 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319498 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319501 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319504 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319506 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319509 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319511 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:22.327575 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319514 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:22.328099 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319516 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:22.328099 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.319519 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:22.328099 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.320206 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:22.328689 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.328660 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:16:22.328727 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.328689 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:16:22.328758 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328750 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:22.328758 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328756 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328760 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328764 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328767 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328770 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328773 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328776 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328779 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328782 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328785 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328787 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328791 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328795 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328798 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328801 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328804 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328806 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328809 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328813 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328816 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:22.328816 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328818 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328821 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328824 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328828 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328830 2578 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328833 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328835 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328838 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328841 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328843 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328847 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328850 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328852 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328855 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328857 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328860 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328862 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:22.329355 
ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328864 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328867 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328869 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:22.329355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328872 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328876 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328878 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328881 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328883 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328886 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328889 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328891 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328894 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328896 2578 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328899 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328902 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328905 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328909 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328916 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328919 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328937 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328941 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328944 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:22.329846 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328946 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328949 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328952 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission 
Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328955 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328958 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328961 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328964 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328966 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328969 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328971 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328974 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328977 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328979 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328982 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328984 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328987 2578 feature_gate.go:328] 
unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328989 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328992 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328994 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328997 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:22.330355 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.328999 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329002 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329004 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329007 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329010 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329012 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.329018 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329149 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329156 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329159 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329162 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329164 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329167 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329170 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329173 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329175 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:22.330849 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329178 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329182 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 
21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329184 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329187 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329190 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329192 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329195 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329197 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329200 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329203 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329205 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329208 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329211 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329213 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 
21:16:22.329216 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329218 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329221 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329223 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329226 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:22.331279 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329229 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329231 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329233 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329236 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329239 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329242 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329244 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329247 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 
21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329249 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329254 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329257 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329260 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329263 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329266 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329269 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329272 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329275 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329277 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329280 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:22.331750 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329282 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:22.332241 
ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329285 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329287 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329290 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329292 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329295 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329297 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329300 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329302 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329305 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329307 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329309 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329312 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329314 2578 feature_gate.go:328] unrecognized feature 
gate: BootImageSkewEnforcement Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329317 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329319 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329322 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329326 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329329 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:16:22.332241 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329332 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329334 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329337 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329339 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329342 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329344 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329347 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:22.332707 ip-10-0-132-219 
kubenswrapper[2578]: W0424 21:16:22.329349 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329352 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329355 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329358 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329360 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329363 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329366 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329368 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329371 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329373 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329376 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329379 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:22.332707 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:22.329381 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 
21:16:22.333218 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.329386 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:16:22.333218 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.330103 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:16:22.333218 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.333139 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:16:22.334219 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.334202 2578 server.go:1019] "Starting client certificate rotation" Apr 24 21:16:22.334329 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.334309 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:16:22.334390 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.334360 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:16:22.364834 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.364798 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:16:22.369489 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.369459 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:16:22.390603 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.390568 2578 log.go:25] 
"Validated CRI v1 runtime API"
Apr 24 21:16:22.397821 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.397794 2578 log.go:25] "Validated CRI v1 image API"
Apr 24 21:16:22.399023 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.399006 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:22.399666 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.399647 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:16:22.406967 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.406914 2578 fs.go:135] Filesystem UUIDs: map[4aab5d88-b84c-437d-856a-ba2d6f2bed82:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 cf6fbf8a-53d2-4b5e-bd27-97adb09859e9:/dev/nvme0n1p3]
Apr 24 21:16:22.406967 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.406961 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:16:22.412354 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.412210 2578 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:22.410968141 +0000 UTC m=+0.478573783 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3070449 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23e1f2088ee0dd895d38080250502a SystemUUID:ec23e1f2-088e-e0dd-895d-38080250502a BootID:8670b1b1-a626-4313-bd5c-4a053e0b2383 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:94:17:b1:72:a9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:94:17:b1:72:a9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:f6:81:b9:04:08 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:16:22.412354 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.412340 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:16:22.412481 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.412436 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:16:22.413663 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.413625 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:16:22.413820 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.413665 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-219.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:16:22.413864 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.413830 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:16:22.413864 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.413839 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:16:22.413864 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.413853 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:22.413964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.413870 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:16:22.415545 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.415529 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:22.415685 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.415673 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:16:22.418621 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.418605 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:16:22.418665 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.418637 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:16:22.418665 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.418659 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:16:22.418727 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.418673 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:16:22.418727 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.418697 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:16:22.419780 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.419766 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:22.419845 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.419787 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:16:22.423310 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.423282 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:16:22.424815 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.424799 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:16:22.427176 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427158 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427183 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427193 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427199 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427205 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427218 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427224 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427231 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427238 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427245 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427257 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:16:22.427266 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.427267 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 21:16:22.428211 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.428201 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 21:16:22.428211 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.428211 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 21:16:22.430648 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.430623 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-219.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:16:22.430766 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.430713 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-219.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:16:22.431381 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.431358 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:16:22.432024 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.432011 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 21:16:22.432073 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.432052 2578 server.go:1295] "Started kubelet"
Apr 24 21:16:22.432168 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.432142 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 21:16:22.432827 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.432772 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 21:16:22.432882 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.432853 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 21:16:22.432889 ip-10-0-132-219 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:16:22.434723 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.434700 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:16:22.435088 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.435073 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:16:22.440955 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.440911 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:22.441900 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.441873 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:16:22.442209 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.442182 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xtwhv"
Apr 24 21:16:22.443270 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443242 2578 factory.go:55] Registering systemd factory
Apr 24 21:16:22.443411 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443390 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:16:22.443506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443251 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:16:22.443506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443328 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:16:22.443506 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.443340 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.443506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443446 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:16:22.443688 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443677 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:16:22.443738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.443690 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:16:22.446296 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.446277 2578 factory.go:153] Registering CRI-O factory
Apr 24 21:16:22.446415 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.446405 2578 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:16:22.446601 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.446587 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:16:22.446734 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.446723 2578 factory.go:103] Registering Raw factory
Apr 24 21:16:22.446829 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.446820 2578 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:16:22.448010 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.447993 2578 manager.go:319] Starting recovery of all containers
Apr 24 21:16:22.448550 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.448516 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:16:22.448900 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.448847 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xtwhv"
Apr 24 21:16:22.449102 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.449070 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-219.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 21:16:22.449301 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.449259 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:16:22.450696 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.449097 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-219.ec2.internal.18a9678cce812679 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-219.ec2.internal,UID:ip-10-0-132-219.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-219.ec2.internal,},FirstTimestamp:2026-04-24 21:16:22.432024185 +0000 UTC m=+0.499629755,LastTimestamp:2026-04-24 21:16:22.432024185 +0000 UTC m=+0.499629755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-219.ec2.internal,}"
Apr 24 21:16:22.459046 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.458894 2578 manager.go:324] Recovery completed
Apr 24 21:16:22.463892 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.463871 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.467436 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.467414 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.467533 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.467455 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.467533 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.467470 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.468113 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.468097 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:16:22.468113 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.468112 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:16:22.468178 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.468131 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:16:22.470640 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.470626 2578 policy_none.go:49] "None policy: Start"
Apr 24 21:16:22.470691 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.470646 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:16:22.470691 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.470656 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:16:22.510649 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.510628 2578 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.510673 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.510688 2578 server.go:85] "Starting device plugin registration server"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.510981 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.510995 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.511101 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.511206 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.511215 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.511763 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:16:22.521083 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.511805 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.521450 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.521325 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:16:22.522715 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.522688 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:16:22.522846 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.522729 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:16:22.522846 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.522755 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:16:22.522846 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.522764 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 21:16:22.522846 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.522808 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 21:16:22.526065 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.526043 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:22.611516 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.611434 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.612367 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.612350 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.612486 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.612386 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.612486 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.612400 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.612486 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.612432 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.621571 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.621549 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.621658 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.621579 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-219.ec2.internal\": node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.623817 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.623790 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal"]
Apr 24 21:16:22.623902 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.623890 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.624799 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.624782 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.624906 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.624815 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.624906 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.624831 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.628014 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.627993 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.630239 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.628864 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.630239 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.628956 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.631294 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.631269 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.631388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.631301 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.631388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.631312 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.631388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.631276 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.631388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.631369 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.631388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.631379 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.632743 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.632729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.632789 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.632756 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:16:22.633504 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.633481 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:16:22.633580 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.633513 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:16:22.633580 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.633524 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:16:22.640915 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.640896 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.645141 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.645120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9b4b9a04da2f64c015ad90ce08dfcded-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal\" (UID: \"9b4b9a04da2f64c015ad90ce08dfcded\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.645239 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.645148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b4b9a04da2f64c015ad90ce08dfcded-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal\" (UID: \"9b4b9a04da2f64c015ad90ce08dfcded\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.645239 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.645169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/88cded07b618f16d406fa0098b2baa8d-config\") pod \"kube-apiserver-proxy-ip-10-0-132-219.ec2.internal\" (UID: \"88cded07b618f16d406fa0098b2baa8d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.657039 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.657000 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-219.ec2.internal\" not found" node="ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.661919 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.661899 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-219.ec2.internal\" not found" node="ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.741399 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.741360 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.745782 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.745758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9b4b9a04da2f64c015ad90ce08dfcded-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal\" (UID: \"9b4b9a04da2f64c015ad90ce08dfcded\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.745883 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.745797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b4b9a04da2f64c015ad90ce08dfcded-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal\" (UID: \"9b4b9a04da2f64c015ad90ce08dfcded\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.745883 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.745815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/88cded07b618f16d406fa0098b2baa8d-config\") pod \"kube-apiserver-proxy-ip-10-0-132-219.ec2.internal\" (UID: \"88cded07b618f16d406fa0098b2baa8d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.745883 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.745844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/88cded07b618f16d406fa0098b2baa8d-config\") pod \"kube-apiserver-proxy-ip-10-0-132-219.ec2.internal\" (UID: \"88cded07b618f16d406fa0098b2baa8d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.745883 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.745848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9b4b9a04da2f64c015ad90ce08dfcded-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal\" (UID: \"9b4b9a04da2f64c015ad90ce08dfcded\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.745883 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.745852 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b4b9a04da2f64c015ad90ce08dfcded-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal\" (UID: \"9b4b9a04da2f64c015ad90ce08dfcded\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.841833 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.841791 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.942703 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:22.942629 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:22.959089 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.959060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:22.965036 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:22.965011 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal"
Apr 24 21:16:23.042740 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.042697 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:23.143186 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.143145 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:23.243785 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.243697 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:23.334336 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.334294 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:16:23.334801 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.334494 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:23.344684 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.344650 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:23.441228 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.441190 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:23.445355 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.445327 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found"
Apr 24 21:16:23.451553 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.451523 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:23.451694 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.451661 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:11:22 +0000 UTC" deadline="2028-01-06 05:20:02.111828504 +0000 UTC"
Apr 24 21:16:23.451694 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.451679 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14912h3m38.660152753s"
Apr 24 21:16:23.506122 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:23.506086 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88cded07b618f16d406fa0098b2baa8d.slice/crio-b686fbd531ad32c80012097d61d4700ba74e9da914c2e1a3d770c1b2a0375ff6 WatchSource:0}: Error finding container b686fbd531ad32c80012097d61d4700ba74e9da914c2e1a3d770c1b2a0375ff6: Status 404 returned error can't find the container with id b686fbd531ad32c80012097d61d4700ba74e9da914c2e1a3d770c1b2a0375ff6
Apr 24 21:16:23.506514 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:23.506473 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4b9a04da2f64c015ad90ce08dfcded.slice/crio-a7ebf407f390fbb739c4ffd1d119d87ff6d8273eda6f383a14eae0da595d94e1 WatchSource:0}: Error finding container a7ebf407f390fbb739c4ffd1d119d87ff6d8273eda6f383a14eae0da595d94e1: Status 404 returned error can't find the container with id a7ebf407f390fbb739c4ffd1d119d87ff6d8273eda6f383a14eae0da595d94e1
Apr 24 21:16:23.506809 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.506793 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-75q2s"
Apr 24 21:16:23.510885 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.510867 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:16:23.514522 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.514502 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-75q2s"
Apr 24 21:16:23.525495 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.525446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal" event={"ID":"88cded07b618f16d406fa0098b2baa8d","Type":"ContainerStarted","Data":"b686fbd531ad32c80012097d61d4700ba74e9da914c2e1a3d770c1b2a0375ff6"}
Apr 24 21:16:23.526412 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.526385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal"
event={"ID":"9b4b9a04da2f64c015ad90ce08dfcded","Type":"ContainerStarted","Data":"a7ebf407f390fbb739c4ffd1d119d87ff6d8273eda6f383a14eae0da595d94e1"} Apr 24 21:16:23.545861 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.545823 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found" Apr 24 21:16:23.646426 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:23.646380 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-219.ec2.internal\" not found" Apr 24 21:16:23.686858 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.686831 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:23.742678 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.742642 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal" Apr 24 21:16:23.749227 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.749155 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:23.752292 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.752269 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:16:23.754420 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.754400 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal" Apr 24 21:16:23.764965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:23.764942 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:16:23.811037 ip-10-0-132-219 kubenswrapper[2578]: I0424 
21:16:23.811004 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:24.419680 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.419646 2578 apiserver.go:52] "Watching apiserver" Apr 24 21:16:24.426793 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.426751 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:16:24.427203 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.427178 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7p6s9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal","openshift-multus/multus-6fjn7","openshift-multus/network-metrics-daemon-4xbxm","openshift-network-operator/iptables-alerter-fr8rd","kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal","openshift-multus/multus-additional-cni-plugins-wpmrx","openshift-network-diagnostics/network-check-target-pnbsk","openshift-ovn-kubernetes/ovnkube-node-pqvlx","kube-system/konnectivity-agent-q9jpr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8","openshift-cluster-node-tuning-operator/tuned-h5rbb"] Apr 24 21:16:24.429722 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.429690 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7p6s9" Apr 24 21:16:24.430834 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.430798 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.430968 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.430911 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:24.431033 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.430998 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:24.433921 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.433885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fr8rd" Apr 24 21:16:24.434253 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434233 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jrbtm\"" Apr 24 21:16:24.434346 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434314 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:16:24.434453 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434432 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.434528 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.434528 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434520 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:16:24.434626 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434537 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:16:24.434714 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.434696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-t525h\"" Apr 24 21:16:24.435051 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.435024 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.435141 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.435033 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.435436 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.435415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.436024 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.435886 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:24.436024 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.435996 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:24.436679 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.436415 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fzlbr\"" Apr 24 21:16:24.436679 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.436503 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.436952 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.436920 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:16:24.437070 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.437052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.437558 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.437486 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.438010 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.437722 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:16:24.438245 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.438226 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:16:24.438331 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.438262 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-czgd4\"" Apr 24 21:16:24.438817 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.438799 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q9jpr" Apr 24 21:16:24.439392 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.439375 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:16:24.439650 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.439633 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.439721 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.439635 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:16:24.440078 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.439941 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:16:24.440140 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.440131 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qcgmz\"" Apr 24 21:16:24.440243 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.440226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.440587 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.440565 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:16:24.440812 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.440771 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.441058 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.441035 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j85kl\"" Apr 24 21:16:24.441151 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.441065 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:16:24.441215 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.441148 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:16:24.441882 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.441863 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.442625 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.442608 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.442707 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.442651 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-ft8qp\"" Apr 24 21:16:24.443417 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.443399 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:16:24.443504 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.443450 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.443792 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.443771 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:24.443890 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.443793 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:24.444077 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.444062 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lz4gl\"" Apr 24 21:16:24.444522 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.444507 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:16:24.455760 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455730 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-cni-bin\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.455903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-env-overrides\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.455903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-etc-selinux\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.455903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455828 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-sys-fs\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.455903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-lib-modules\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.455903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-host\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.455920 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-ovn\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-modprobe-d\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-tmp\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-ovnkube-config\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.456185 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-run\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456228 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-os-release\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-socket-dir-parent\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-slash\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnm7k\" (UniqueName: \"kubernetes.io/projected/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-kube-api-access-cnm7k\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-multus-certs\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.456509 
ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-log-socket\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456417 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-ovnkube-script-lib\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-sys\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456472 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-cni-multus\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.456509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-kubelet\") pod \"multus-6fjn7\" (UID: 
\"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxj4\" (UniqueName: \"kubernetes.io/projected/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-kube-api-access-7zxj4\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-kubelet\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-node-log\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-cni-netd\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbmc\" (UniqueName: 
\"kubernetes.io/projected/4296bff6-4cb8-423e-a188-d5e73736c322-kube-api-access-dhbmc\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwpj\" (UniqueName: \"kubernetes.io/projected/d1b4f3db-7875-478c-93b8-7c3155edd974-kube-api-access-zbwpj\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-etc-kubernetes\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-registration-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtldr\" (UniqueName: \"kubernetes.io/projected/27ff64ed-25a6-46c3-a473-b5c08e56c10f-kube-api-access-gtldr\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysctl-conf\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-system-cni-dir\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-cnibin\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457042 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.456982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-os-release\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-k8s-cni-cncf-io\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-hostroot\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-run-netns\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-systemd\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-host\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-daemon-config\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-systemd-units\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm94c\" (UniqueName: \"kubernetes.io/projected/74c3a396-cd8f-4290-9e5d-1a182b254157-kube-api-access-qm94c\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4296bff6-4cb8-423e-a188-d5e73736c322-iptables-alerter-script\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/647bf41d-3e57-40c7-bd39-3154d24499dd-agent-certs\") pod \"konnectivity-agent-q9jpr\" (UID: \"647bf41d-3e57-40c7-bd39-3154d24499dd\") " pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-system-cni-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysconfig\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-tuned\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-cni-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysctl-d\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.457844 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-systemd\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-var-lib-kubelet\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-serviceca\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457496 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv4gq\" (UniqueName: \"kubernetes.io/projected/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-kube-api-access-jv4gq\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457512 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-cnibin\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-cni-bin\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-kubernetes\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j44dl\" (UniqueName: \"kubernetes.io/projected/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-kube-api-access-j44dl\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/647bf41d-3e57-40c7-bd39-3154d24499dd-konnectivity-ca\") pod \"konnectivity-agent-q9jpr\" (UID: \"647bf41d-3e57-40c7-bd39-3154d24499dd\") " pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-conf-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-etc-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-socket-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4296bff6-4cb8-423e-a188-d5e73736c322-host-slash\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457809 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-cni-binary-copy\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.458590 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-netns\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.459274 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74c3a396-cd8f-4290-9e5d-1a182b254157-ovn-node-metrics-cert\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.459274 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-var-lib-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.459274 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.457972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.459274 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.458006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-device-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.515445 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.515398 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:23 +0000 UTC" deadline="2028-02-06 20:09:29.266259742 +0000 UTC"
Apr 24 21:16:24.515445 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.515433 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15670h53m4.750830228s"
Apr 24 21:16:24.558262 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.558262 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-systemd\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-var-lib-kubelet\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-var-lib-kubelet\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558376 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-serviceca\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv4gq\" (UniqueName: \"kubernetes.io/projected/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-kube-api-access-jv4gq\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-systemd\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-cnibin\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-cni-bin\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.558505 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-cni-bin\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-cnibin\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-kubernetes\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j44dl\" (UniqueName: \"kubernetes.io/projected/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-kube-api-access-j44dl\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/647bf41d-3e57-40c7-bd39-3154d24499dd-konnectivity-ca\") pod \"konnectivity-agent-q9jpr\" (UID: \"647bf41d-3e57-40c7-bd39-3154d24499dd\") " pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-conf-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558716 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-kubernetes\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-etc-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-etc-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-socket-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4296bff6-4cb8-423e-a188-d5e73736c322-host-slash\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-conf-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-cni-binary-copy\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.558866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558865 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-netns\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74c3a396-cd8f-4290-9e5d-1a182b254157-ovn-node-metrics-cert\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-serviceca\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-var-lib-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-netns\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-device-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-cni-bin\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-env-overrides\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-var-lib-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-etc-selinux\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-sys-fs\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-device-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-lib-modules\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-host\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:24.559671 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-ovn\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-modprobe-d\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-lib-modules\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-tmp\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-cni-bin\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-ovnkube-config\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559333 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled.
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.558938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-socket-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/647bf41d-3e57-40c7-bd39-3154d24499dd-konnectivity-ca\") pod \"konnectivity-agent-q9jpr\" (UID: \"647bf41d-3e57-40c7-bd39-3154d24499dd\") " pod="kube-system/konnectivity-agent-q9jpr" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559421 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-openvswitch\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-host\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4296bff6-4cb8-423e-a188-d5e73736c322-host-slash\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-cni-binary-copy\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-modprobe-d\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-run-ovn\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.559586 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:24.560447 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-sys-fs\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.559692 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:25.0596715 +0000 UTC m=+3.127277058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-etc-selinux\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-run\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-os-release\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559800 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-socket-dir-parent\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-slash\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-ovnkube-config\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnm7k\" (UniqueName: \"kubernetes.io/projected/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-kube-api-access-cnm7k\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.559910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-multus-certs\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 
21:16:24.559954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-os-release\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560008 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-run\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-log-socket\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-env-overrides\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-ovnkube-script-lib\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560104 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-socket-dir-parent\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.561325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-sys\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-multus-certs\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-slash\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-cni-multus\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-log-socket\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-kubelet\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-sys\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560224 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-cni-multus\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-var-lib-kubelet\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxj4\" (UniqueName: 
\"kubernetes.io/projected/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-kube-api-access-7zxj4\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-kubelet\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-node-log\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-cni-netd\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbmc\" (UniqueName: \"kubernetes.io/projected/4296bff6-4cb8-423e-a188-d5e73736c322-kube-api-access-dhbmc\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-kubelet\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-node-log\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwpj\" (UniqueName: \"kubernetes.io/projected/d1b4f3db-7875-478c-93b8-7c3155edd974-kube-api-access-zbwpj\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560510 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-etc-kubernetes\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-registration-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtldr\" (UniqueName: \"kubernetes.io/projected/27ff64ed-25a6-46c3-a473-b5c08e56c10f-kube-api-access-gtldr\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysctl-conf\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-system-cni-dir\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74c3a396-cd8f-4290-9e5d-1a182b254157-ovnkube-script-lib\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-cni-netd\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-cnibin\") pod 
\"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27ff64ed-25a6-46c3-a473-b5c08e56c10f-registration-dir\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560800 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-cnibin\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-etc-kubernetes\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7" Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-os-release\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.562644 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-k8s-cni-cncf-io\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysctl-conf\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-system-cni-dir\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1b4f3db-7875-478c-93b8-7c3155edd974-os-release\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-host-run-k8s-cni-cncf-io\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.560989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-hostroot\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-run-netns\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561073 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-hostroot\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-systemd\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-host-run-netns\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-host\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-daemon-config\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561166 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-systemd\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-systemd-units\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-host\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74c3a396-cd8f-4290-9e5d-1a182b254157-systemd-units\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561222 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm94c\" (UniqueName: \"kubernetes.io/projected/74c3a396-cd8f-4290-9e5d-1a182b254157-kube-api-access-qm94c\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4296bff6-4cb8-423e-a188-d5e73736c322-iptables-alerter-script\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.563456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d1b4f3db-7875-478c-93b8-7c3155edd974-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561279 2578 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/647bf41d-3e57-40c7-bd39-3154d24499dd-agent-certs\") pod \"konnectivity-agent-q9jpr\" (UID: \"647bf41d-3e57-40c7-bd39-3154d24499dd\") " pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-system-cni-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysconfig\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-tuned\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-cni-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysctl-d\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-system-cni-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-cni-dir\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysctl-d\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-multus-daemon-config\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-sysconfig\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.561854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4296bff6-4cb8-423e-a188-d5e73736c322-iptables-alerter-script\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.563606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74c3a396-cd8f-4290-9e5d-1a182b254157-ovn-node-metrics-cert\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.563850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-etc-tuned\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.563903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/647bf41d-3e57-40c7-bd39-3154d24499dd-agent-certs\") pod \"konnectivity-agent-q9jpr\" (UID: \"647bf41d-3e57-40c7-bd39-3154d24499dd\") " pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:24.564690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.563858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-tmp\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.566330 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.566271 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:24.566330 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.566299 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:24.566330 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.566313 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x9r5w for pod openshift-network-diagnostics/network-check-target-pnbsk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:24.566621 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:24.566391 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w podName:763a8f03-407c-4bd4-b683-27ba0614f163 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:25.066370068 +0000 UTC m=+3.133975627 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-x9r5w" (UniqueName: "kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w") pod "network-check-target-pnbsk" (UID: "763a8f03-407c-4bd4-b683-27ba0614f163") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:24.567403 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.567376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv4gq\" (UniqueName: \"kubernetes.io/projected/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-kube-api-access-jv4gq\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:24.567563 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.567438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j44dl\" (UniqueName: \"kubernetes.io/projected/5f9a3a11-4bde-44e9-a68d-2d9ababf72d3-kube-api-access-j44dl\") pod \"node-ca-7p6s9\" (UID: \"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3\") " pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.568415 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.568392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnm7k\" (UniqueName: \"kubernetes.io/projected/b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5-kube-api-access-cnm7k\") pod \"tuned-h5rbb\" (UID: \"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5\") " pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:24.570376 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.570352 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbmc\" (UniqueName: \"kubernetes.io/projected/4296bff6-4cb8-423e-a188-d5e73736c322-kube-api-access-dhbmc\") pod \"iptables-alerter-fr8rd\" (UID: \"4296bff6-4cb8-423e-a188-d5e73736c322\") " pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.570960 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.570905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxj4\" (UniqueName: \"kubernetes.io/projected/b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb-kube-api-access-7zxj4\") pod \"multus-6fjn7\" (UID: \"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb\") " pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.570960 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.570919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm94c\" (UniqueName: \"kubernetes.io/projected/74c3a396-cd8f-4290-9e5d-1a182b254157-kube-api-access-qm94c\") pod \"ovnkube-node-pqvlx\" (UID: \"74c3a396-cd8f-4290-9e5d-1a182b254157\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.571116 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.571021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwpj\" (UniqueName: \"kubernetes.io/projected/d1b4f3db-7875-478c-93b8-7c3155edd974-kube-api-access-zbwpj\") pod \"multus-additional-cni-plugins-wpmrx\" (UID: \"d1b4f3db-7875-478c-93b8-7c3155edd974\") " pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.571909 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.571883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtldr\" (UniqueName: \"kubernetes.io/projected/27ff64ed-25a6-46c3-a473-b5c08e56c10f-kube-api-access-gtldr\") pod \"aws-ebs-csi-driver-node-xnvf8\" (UID: \"27ff64ed-25a6-46c3-a473-b5c08e56c10f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.733693 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.733610 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:24.743494 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.743458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7p6s9"
Apr 24 21:16:24.750530 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.750492 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6fjn7"
Apr 24 21:16:24.761311 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.761279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fr8rd"
Apr 24 21:16:24.765942 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.765904 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wpmrx"
Apr 24 21:16:24.772720 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.772689 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:24.780499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.780471 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:24.789302 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.789271 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8"
Apr 24 21:16:24.795084 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:24.795050 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h5rbb"
Apr 24 21:16:25.065729 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.065694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:25.065864 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.065843 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:25.065939 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.065914 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:26.065897608 +0000 UTC m=+4.133503170 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:25.073515 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.073485 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c3a396_cd8f_4290_9e5d_1a182b254157.slice/crio-4cbb41d4eb4e52f637f70b24e284fb3956f3f99386af53ecfdad4343b9473435 WatchSource:0}: Error finding container 4cbb41d4eb4e52f637f70b24e284fb3956f3f99386af53ecfdad4343b9473435: Status 404 returned error can't find the container with id 4cbb41d4eb4e52f637f70b24e284fb3956f3f99386af53ecfdad4343b9473435
Apr 24 21:16:25.074205 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.074180 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e1d634_7f8c_4ad8_99ff_5ba60346cbd5.slice/crio-42436a6775fa5e369faa279be88af97990ccab66d64d0c525fe9247762ea44a8 WatchSource:0}: Error finding container 42436a6775fa5e369faa279be88af97990ccab66d64d0c525fe9247762ea44a8: Status 404 returned error can't find the container with id 42436a6775fa5e369faa279be88af97990ccab66d64d0c525fe9247762ea44a8
Apr 24 21:16:25.078330 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.078282 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8465e7e_a69f_4eff_bf07_d8e7f8de3cdb.slice/crio-3a7f1c1eea0095ef8f9eed0be4d717794acae0ff56035fc0f4403896e9944168 WatchSource:0}: Error finding container 3a7f1c1eea0095ef8f9eed0be4d717794acae0ff56035fc0f4403896e9944168: Status 404 returned error can't find the container with id 3a7f1c1eea0095ef8f9eed0be4d717794acae0ff56035fc0f4403896e9944168
Apr 24 21:16:25.079105
ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.079057 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647bf41d_3e57_40c7_bd39_3154d24499dd.slice/crio-a13b99971f3364001a070a7d5a59533ade5c5d6c5793aaaf4f5d1f5f3168388d WatchSource:0}: Error finding container a13b99971f3364001a070a7d5a59533ade5c5d6c5793aaaf4f5d1f5f3168388d: Status 404 returned error can't find the container with id a13b99971f3364001a070a7d5a59533ade5c5d6c5793aaaf4f5d1f5f3168388d
Apr 24 21:16:25.079788 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.079768 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4296bff6_4cb8_423e_a188_d5e73736c322.slice/crio-a074e46b4a0755b81b476b8a0b0f188659f483eb2d0b666db5036959bfe983bf WatchSource:0}: Error finding container a074e46b4a0755b81b476b8a0b0f188659f483eb2d0b666db5036959bfe983bf: Status 404 returned error can't find the container with id a074e46b4a0755b81b476b8a0b0f188659f483eb2d0b666db5036959bfe983bf
Apr 24 21:16:25.080798 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.080772 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ff64ed_25a6_46c3_a473_b5c08e56c10f.slice/crio-2c0d24d9eb385d25d167719e62b9c285f7d6f542df140c94104095eb65a4357b WatchSource:0}: Error finding container 2c0d24d9eb385d25d167719e62b9c285f7d6f542df140c94104095eb65a4357b: Status 404 returned error can't find the container with id 2c0d24d9eb385d25d167719e62b9c285f7d6f542df140c94104095eb65a4357b
Apr 24 21:16:25.082146 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.082064 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9a3a11_4bde_44e9_a68d_2d9ababf72d3.slice/crio-6a874eac824f87d10433870c23ebdadf97988a281839e3db53c8e1c605166679 WatchSource:0}: Error finding container 6a874eac824f87d10433870c23ebdadf97988a281839e3db53c8e1c605166679: Status 404 returned error can't find the container with id 6a874eac824f87d10433870c23ebdadf97988a281839e3db53c8e1c605166679
Apr 24 21:16:25.083513 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:25.083489 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b4f3db_7875_478c_93b8_7c3155edd974.slice/crio-4bdcdc0cfa284eb2fa3577f03d7cb9fc44ebafc478d1b0d4b6a05bb6b9d62325 WatchSource:0}: Error finding container 4bdcdc0cfa284eb2fa3577f03d7cb9fc44ebafc478d1b0d4b6a05bb6b9d62325: Status 404 returned error can't find the container with id 4bdcdc0cfa284eb2fa3577f03d7cb9fc44ebafc478d1b0d4b6a05bb6b9d62325
Apr 24 21:16:25.166941 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.166748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:25.166941 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.166937 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:25.167135 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.166960 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:25.167135 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.166971 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x9r5w for pod openshift-network-diagnostics/network-check-target-pnbsk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:25.167135 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.167055 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w podName:763a8f03-407c-4bd4-b683-27ba0614f163 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:26.167032852 +0000 UTC m=+4.234638408 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9r5w" (UniqueName: "kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w") pod "network-check-target-pnbsk" (UID: "763a8f03-407c-4bd4-b683-27ba0614f163") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:25.516565 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.516371 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:23 +0000 UTC" deadline="2027-12-18 02:58:19.443025219 +0000 UTC"
Apr 24 21:16:25.516565 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.516413 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14453h41m53.926616693s"
Apr 24 21:16:25.524120 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.523585 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:25.524120 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:25.523728 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:25.535325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.535264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" event={"ID":"27ff64ed-25a6-46c3-a473-b5c08e56c10f","Type":"ContainerStarted","Data":"2c0d24d9eb385d25d167719e62b9c285f7d6f542df140c94104095eb65a4357b"}
Apr 24 21:16:25.540106 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.540055 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q9jpr" event={"ID":"647bf41d-3e57-40c7-bd39-3154d24499dd","Type":"ContainerStarted","Data":"a13b99971f3364001a070a7d5a59533ade5c5d6c5793aaaf4f5d1f5f3168388d"}
Apr 24 21:16:25.545443 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.545402 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6fjn7" event={"ID":"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb","Type":"ContainerStarted","Data":"3a7f1c1eea0095ef8f9eed0be4d717794acae0ff56035fc0f4403896e9944168"}
Apr 24 21:16:25.551989 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.551953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" event={"ID":"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5","Type":"ContainerStarted","Data":"42436a6775fa5e369faa279be88af97990ccab66d64d0c525fe9247762ea44a8"}
Apr 24 21:16:25.553918 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.553857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"4cbb41d4eb4e52f637f70b24e284fb3956f3f99386af53ecfdad4343b9473435"}
Apr 24 21:16:25.562658 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.562616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal" event={"ID":"88cded07b618f16d406fa0098b2baa8d","Type":"ContainerStarted","Data":"9ac4d6fa08fb6ffde07006c17a88a8450a3efa0d65bc469ddd3388d282a350dd"}
Apr 24 21:16:25.569791 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.569717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerStarted","Data":"4bdcdc0cfa284eb2fa3577f03d7cb9fc44ebafc478d1b0d4b6a05bb6b9d62325"}
Apr 24 21:16:25.576955 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.576875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7p6s9" event={"ID":"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3","Type":"ContainerStarted","Data":"6a874eac824f87d10433870c23ebdadf97988a281839e3db53c8e1c605166679"}
Apr 24 21:16:25.586410 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:25.586368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fr8rd" event={"ID":"4296bff6-4cb8-423e-a188-d5e73736c322","Type":"ContainerStarted","Data":"a074e46b4a0755b81b476b8a0b0f188659f483eb2d0b666db5036959bfe983bf"}
Apr 24 21:16:26.073485 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:26.073441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID:
\"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:26.073672 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.073593 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:26.073672 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.073662 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.073642352 +0000 UTC m=+6.141247922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:26.174030 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:26.173990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:26.174243 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.174216 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:26.174323 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.174250 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:26.174323 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.174265 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x9r5w for pod openshift-network-diagnostics/network-check-target-pnbsk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:26.174420 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.174335 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w podName:763a8f03-407c-4bd4-b683-27ba0614f163 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.174315504 +0000 UTC m=+6.241921076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9r5w" (UniqueName: "kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w") pod "network-check-target-pnbsk" (UID: "763a8f03-407c-4bd4-b683-27ba0614f163") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:26.526220 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:26.526134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:26.526660 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:26.526278 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:26.596880 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:26.596636 2578 generic.go:358] "Generic (PLEG): container finished" podID="9b4b9a04da2f64c015ad90ce08dfcded" containerID="4f77b871e6cd52914adc7c42f0bca0e1a2d2a2da9a7407ed7e0d52c4f193da46" exitCode=0
Apr 24 21:16:26.604857 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:26.603267 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal" event={"ID":"9b4b9a04da2f64c015ad90ce08dfcded","Type":"ContainerDied","Data":"4f77b871e6cd52914adc7c42f0bca0e1a2d2a2da9a7407ed7e0d52c4f193da46"}
Apr 24 21:16:26.617444 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:26.617387 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-219.ec2.internal" podStartSLOduration=3.617367409 podStartE2EDuration="3.617367409s" podCreationTimestamp="2026-04-24 21:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:25.576986053 +0000 UTC m=+3.644591629" watchObservedRunningTime="2026-04-24 21:16:26.617367409 +0000 UTC m=+4.684972988"
Apr 24 21:16:27.524091 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:27.523489 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:27.524091 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:27.523627 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:27.601890 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:27.601852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal" event={"ID":"9b4b9a04da2f64c015ad90ce08dfcded","Type":"ContainerStarted","Data":"f16fdf146f3e4c8a9d1a6062412d5d55eb7f13c4a2a4d79cee49ded824e50c92"}
Apr 24 21:16:28.093352 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:28.093270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:28.093559 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.093429 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:28.093559 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.093512 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:32.093491188 +0000 UTC m=+10.161096749 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:28.194572 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:28.194066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:28.194572 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.194217 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:28.194572 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.194235 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:28.194572 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.194244 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x9r5w for pod openshift-network-diagnostics/network-check-target-pnbsk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:28.194572 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.194288 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w podName:763a8f03-407c-4bd4-b683-27ba0614f163 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:32.19427507 +0000 UTC m=+10.261880626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9r5w" (UniqueName: "kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w") pod "network-check-target-pnbsk" (UID: "763a8f03-407c-4bd4-b683-27ba0614f163") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:28.527063 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:28.526470 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:28.527063 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:28.526624 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:29.523903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:29.523847 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:29.524397 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:29.523998 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:30.523465 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:30.523428 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:30.523650 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:30.523581 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:31.523787 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:31.523264 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:31.523787 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:31.523402 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:32.128337 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:32.128287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:32.128509 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.128450 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:32.128572 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.128518 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:40.128497521 +0000 UTC m=+18.196103083 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:32.229184 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:32.229145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:32.229373 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.229353 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:32.229440 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.229375 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:32.229440 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.229390 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x9r5w for pod openshift-network-diagnostics/network-check-target-pnbsk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:32.229543 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.229456 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w podName:763a8f03-407c-4bd4-b683-27ba0614f163 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:40.229437497 +0000 UTC m=+18.297043079 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9r5w" (UniqueName: "kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w") pod "network-check-target-pnbsk" (UID: "763a8f03-407c-4bd4-b683-27ba0614f163") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:32.526607 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:32.526130 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:32.526607 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:32.526272 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:33.523293 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:33.523252 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:33.523493 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:33.523400 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:34.523703 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:34.523661 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:34.524129 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:34.523791 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:35.523407 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:35.523371 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:35.523675 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:35.523502 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:36.523719 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:36.523683 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:36.524211 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:36.523820 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:37.523411 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:37.523321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:37.523589 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:37.523449 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:38.526399 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:38.526361 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:38.526809 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:38.526472 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:39.523841 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:39.523798 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:39.524059 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:39.523942 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:40.186177 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:40.186137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:40.186648 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.186288 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:40.186648 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.186351 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.186335663 +0000 UTC m=+34.253941220 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:40.287406 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:40.287369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:40.287584 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.287499 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:40.287584 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.287513 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:40.287584 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.287522 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x9r5w for pod openshift-network-diagnostics/network-check-target-pnbsk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:40.287584 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.287570 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w podName:763a8f03-407c-4bd4-b683-27ba0614f163 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:56.287552531 +0000 UTC m=+34.355158091 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9r5w" (UniqueName: "kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w") pod "network-check-target-pnbsk" (UID: "763a8f03-407c-4bd4-b683-27ba0614f163") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:40.523725 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:40.523639 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:40.523893 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:40.523767 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c" Apr 24 21:16:41.523690 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:41.523624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:41.524189 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:41.523763 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163" Apr 24 21:16:42.204485 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.204310 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-219.ec2.internal" podStartSLOduration=19.204291164 podStartE2EDuration="19.204291164s" podCreationTimestamp="2026-04-24 21:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:27.614984689 +0000 UTC m=+5.682590271" watchObservedRunningTime="2026-04-24 21:16:42.204291164 +0000 UTC m=+20.271896744" Apr 24 21:16:42.204624 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.204612 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-c7xmt"] Apr 24 21:16:42.233219 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.233194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.233333 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:42.233265 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471" Apr 24 21:16:42.303785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.303564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e57ddd4a-8801-45ab-b463-3398cdeea471-dbus\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.303785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.303665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.303785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.303699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e57ddd4a-8801-45ab-b463-3398cdeea471-kubelet-config\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.407336 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.407292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e57ddd4a-8801-45ab-b463-3398cdeea471-dbus\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.407514 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.407403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.407514 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.407438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e57ddd4a-8801-45ab-b463-3398cdeea471-kubelet-config\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.407652 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.407538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e57ddd4a-8801-45ab-b463-3398cdeea471-kubelet-config\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.407711 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.407660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e57ddd4a-8801-45ab-b463-3398cdeea471-dbus\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:42.407789 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:42.407767 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:42.407847 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:42.407838 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret podName:e57ddd4a-8801-45ab-b463-3398cdeea471 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:42.907818967 +0000 UTC m=+20.975424526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret") pod "global-pull-secret-syncer-c7xmt" (UID: "e57ddd4a-8801-45ab-b463-3398cdeea471") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:42.525175 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.524917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:42.525175 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:42.525112 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:42.629562 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.629394 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1b4f3db-7875-478c-93b8-7c3155edd974" containerID="c0b6105b1a38c6ada7f757b1ccadd014fc86802a7eb5925491f3b78477ebb058" exitCode=0
Apr 24 21:16:42.629738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.629476 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerDied","Data":"c0b6105b1a38c6ada7f757b1ccadd014fc86802a7eb5925491f3b78477ebb058"}
Apr 24 21:16:42.631082 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.630975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7p6s9" event={"ID":"5f9a3a11-4bde-44e9-a68d-2d9ababf72d3","Type":"ContainerStarted","Data":"aafba078b77a638e161158537d64a742f12b0b617e8d56b826705eb6f40b9fdd"}
Apr 24 21:16:42.632942 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.632887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" event={"ID":"27ff64ed-25a6-46c3-a473-b5c08e56c10f","Type":"ContainerStarted","Data":"5ae35945ffb0eee86ff8d0c537a6ed9db94573b04cd20390dcce3edacd26062e"}
Apr 24 21:16:42.634329 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.634300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q9jpr" event={"ID":"647bf41d-3e57-40c7-bd39-3154d24499dd","Type":"ContainerStarted","Data":"7dcf792d758749009fbdffedee742ed564b2648d040a7a03c0af08297f9d5d0d"}
Apr 24 21:16:42.635795 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.635765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6fjn7" event={"ID":"b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb","Type":"ContainerStarted","Data":"701125e11adfe815deb6983e4b61667486b9e1e01f3ee81784078f3cf63f5d10"}
Apr 24 21:16:42.637291 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.637268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" event={"ID":"b3e1d634-7f8c-4ad8-99ff-5ba60346cbd5","Type":"ContainerStarted","Data":"a518034f91e400df2341e2df1379aac0fee02ba44891d7fb7088ac7d1d6e7c39"}
Apr 24 21:16:42.639112 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.639091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"68dd3e1bce2ddd4c450ddb23485647921fa3c61559c48ebe42b34a6cd6706e7c"}
Apr 24 21:16:42.639183 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.639115 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"45e1649cca402692e156f620a4871944adfce63850d3cfe248c0976ee2d95575"}
Apr 24 21:16:42.639183 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.639126 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"ce59efea7488b7da4224609314f240f15ef5d6e4b25ebf95ca5f7c89707533e9"}
Apr 24 21:16:42.661568 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.661515 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7p6s9" podStartSLOduration=8.484197062 podStartE2EDuration="20.661495758s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.084483424 +0000 UTC m=+3.152088997" lastFinishedPulling="2026-04-24 21:16:37.261782136 +0000 UTC m=+15.329387693" observedRunningTime="2026-04-24 21:16:42.661235434 +0000 UTC m=+20.728841026" watchObservedRunningTime="2026-04-24 21:16:42.661495758 +0000 UTC m=+20.729101337"
Apr 24 21:16:42.675450 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.675383 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q9jpr" podStartSLOduration=3.932309988 podStartE2EDuration="20.675371764s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.081503239 +0000 UTC m=+3.149108796" lastFinishedPulling="2026-04-24 21:16:41.824565013 +0000 UTC m=+19.892170572" observedRunningTime="2026-04-24 21:16:42.675181448 +0000 UTC m=+20.742787026" watchObservedRunningTime="2026-04-24 21:16:42.675371764 +0000 UTC m=+20.742977322"
Apr 24 21:16:42.705410 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.705254 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-h5rbb" podStartSLOduration=3.555496084 podStartE2EDuration="20.705233515s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.076978846 +0000 UTC m=+3.144584403" lastFinishedPulling="2026-04-24 21:16:42.226716277 +0000 UTC m=+20.294321834" observedRunningTime="2026-04-24 21:16:42.688760879 +0000 UTC m=+20.756366676" watchObservedRunningTime="2026-04-24 21:16:42.705233515 +0000 UTC m=+20.772839097"
Apr 24 21:16:42.706640 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.705898 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6fjn7" podStartSLOduration=3.5494632 podStartE2EDuration="20.705551069s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.080730016 +0000 UTC m=+3.148335574" lastFinishedPulling="2026-04-24 21:16:42.236817886 +0000 UTC m=+20.304423443" observedRunningTime="2026-04-24 21:16:42.702697231 +0000 UTC m=+20.770302810" watchObservedRunningTime="2026-04-24 21:16:42.705551069 +0000 UTC m=+20.773156652"
Apr 24 21:16:42.911714 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:42.911676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:42.911870 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:42.911806 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:42.911870 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:42.911867 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret podName:e57ddd4a-8801-45ab-b463-3398cdeea471 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:43.911850067 +0000 UTC m=+21.979455626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret") pod "global-pull-secret-syncer-c7xmt" (UID: "e57ddd4a-8801-45ab-b463-3398cdeea471") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:43.485533 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.485504 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:16:43.522769 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.522610 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:43.485527202Z","UUID":"f4c640a1-4d89-4bf7-a17f-0149bafacfb7","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:16:43.523025 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.523004 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:43.523122 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:43.523107 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:43.524897 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.524872 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:16:43.525045 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.524904 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:16:43.646347 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.646313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" event={"ID":"27ff64ed-25a6-46c3-a473-b5c08e56c10f","Type":"ContainerStarted","Data":"5517d0285470af88ec160bc1fd34a60afb70320aaba6bf1851de3c87367dd59c"}
Apr 24 21:16:43.649343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.649307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"f39608ebcb778fa9c48fc6a21347068fc9b7f3c5263bd05459c6316e8ff1f652"}
Apr 24 21:16:43.649476 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.649351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"69cbdcb18a81449e1cc5e84dd016bf8236bc54228aa9af7176f9227fe63e4784"}
Apr 24 21:16:43.649476 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.649367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"6153f84306273e09c1a92e5820cfaa6acc1330f2864627253e63079361c4ba73"}
Apr 24 21:16:43.918367 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:43.918324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:43.918585 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:43.918563 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:43.918652 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:43.918640 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret podName:e57ddd4a-8801-45ab-b463-3398cdeea471 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:45.918621349 +0000 UTC m=+23.986226907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret") pod "global-pull-secret-syncer-c7xmt" (UID: "e57ddd4a-8801-45ab-b463-3398cdeea471") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:44.523605 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:44.523472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:44.523809 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:44.523632 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:44.523809 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:44.523688 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:44.523809 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:44.523795 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:44.653257 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:44.653215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fr8rd" event={"ID":"4296bff6-4cb8-423e-a188-d5e73736c322","Type":"ContainerStarted","Data":"cd8e81ff6741c7cc554cb71944a739b3b8a98afb24bfffe29049c482a01cf5d2"}
Apr 24 21:16:44.655527 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:44.655494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" event={"ID":"27ff64ed-25a6-46c3-a473-b5c08e56c10f","Type":"ContainerStarted","Data":"ca9e02f424dc1214471582ef18ac535b1f657adf81a43b0006759dda3c88ea33"}
Apr 24 21:16:44.680794 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:44.680736 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fr8rd" podStartSLOduration=5.559004511 podStartE2EDuration="22.680715801s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.081913795 +0000 UTC m=+3.149519357" lastFinishedPulling="2026-04-24 21:16:42.203625076 +0000 UTC m=+20.271230647" observedRunningTime="2026-04-24 21:16:44.680431696 +0000 UTC m=+22.748037274" watchObservedRunningTime="2026-04-24 21:16:44.680715801 +0000 UTC m=+22.748321381"
Apr 24 21:16:45.523833 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:45.523793 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:45.524045 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:45.523965 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:45.661227 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:45.661173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"42497a0cb97b103313fc6aeb40a977fbd01254104af03872e776c1c3f12d91d5"}
Apr 24 21:16:45.934767 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:45.934714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:45.934980 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:45.934842 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:45.934980 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:45.934908 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret podName:e57ddd4a-8801-45ab-b463-3398cdeea471 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.934890196 +0000 UTC m=+28.002495775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret") pod "global-pull-secret-syncer-c7xmt" (UID: "e57ddd4a-8801-45ab-b463-3398cdeea471") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:46.523374 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:46.523341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:46.523541 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:46.523341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:46.523541 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:46.523468 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:46.523616 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:46.523592 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:47.132511 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.132426 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:47.134181 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.133701 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:47.154280 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.154235 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xnvf8" podStartSLOduration=6.019058047 podStartE2EDuration="25.15421764s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.08319159 +0000 UTC m=+3.150797147" lastFinishedPulling="2026-04-24 21:16:44.218351178 +0000 UTC m=+22.285956740" observedRunningTime="2026-04-24 21:16:44.696137023 +0000 UTC m=+22.763742604" watchObservedRunningTime="2026-04-24 21:16:47.15421764 +0000 UTC m=+25.221823218"
Apr 24 21:16:47.523606 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.523574 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:47.523767 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:47.523681 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:47.667649 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.667612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" event={"ID":"74c3a396-cd8f-4290-9e5d-1a182b254157","Type":"ContainerStarted","Data":"4d758ebf62d43d939acc4a1e8cd2f7eaf31c46766c14de0d6cefe46b0bf0aa29"}
Apr 24 21:16:47.668012 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.667937 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:47.669492 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.669464 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1b4f3db-7875-478c-93b8-7c3155edd974" containerID="5bc94a0a88b0553bb268999baf7661b387426129e270195fc67c9fd0438f73ac" exitCode=0
Apr 24 21:16:47.669617 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.669528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerDied","Data":"5bc94a0a88b0553bb268999baf7661b387426129e270195fc67c9fd0438f73ac"}
Apr 24 21:16:47.669744 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.669725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:47.670420 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.670355 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q9jpr"
Apr 24 21:16:47.684434 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.684401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:47.713720 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:47.713663 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx" podStartSLOduration=8.472081776 podStartE2EDuration="25.713648317s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.075861782 +0000 UTC m=+3.143467340" lastFinishedPulling="2026-04-24 21:16:42.317428321 +0000 UTC m=+20.385033881" observedRunningTime="2026-04-24 21:16:47.713073466 +0000 UTC m=+25.780679042" watchObservedRunningTime="2026-04-24 21:16:47.713648317 +0000 UTC m=+25.781253895"
Apr 24 21:16:48.524129 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.523848 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:48.524642 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.523898 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:48.524642 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:48.524169 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:48.524642 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:48.524233 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:48.672904 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.672869 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1b4f3db-7875-478c-93b8-7c3155edd974" containerID="99fb6b7e74e481c6eb016a7f5e7286b3d8cd66ed75d35a9d0a16b654a4c3d245" exitCode=0
Apr 24 21:16:48.673078 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.672965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerDied","Data":"99fb6b7e74e481c6eb016a7f5e7286b3d8cd66ed75d35a9d0a16b654a4c3d245"}
Apr 24 21:16:48.673245 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.673212 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:48.673768 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.673699 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:48.688905 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:48.688872 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:16:49.092336 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.092303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c7xmt"]
Apr 24 21:16:49.092497 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.092451 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:49.092578 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:49.092550 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:49.095574 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.095544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pnbsk"]
Apr 24 21:16:49.095716 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.095660 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:49.095775 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:49.095740 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:49.101698 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.101670 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4xbxm"]
Apr 24 21:16:49.101861 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.101808 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:49.101957 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:49.101918 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:49.676725 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.676684 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1b4f3db-7875-478c-93b8-7c3155edd974" containerID="3a4c451e93dd9b3ccf3b2773c0a47873e6b9963472dca9038fa0e9e2bdc3dbb3" exitCode=0
Apr 24 21:16:49.677176 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.676780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerDied","Data":"3a4c451e93dd9b3ccf3b2773c0a47873e6b9963472dca9038fa0e9e2bdc3dbb3"}
Apr 24 21:16:49.965441 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:49.965357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:49.965575 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:49.965472 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:49.965575 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:49.965531 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret podName:e57ddd4a-8801-45ab-b463-3398cdeea471 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.965515384 +0000 UTC m=+36.033120942 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret") pod "global-pull-secret-syncer-c7xmt" (UID: "e57ddd4a-8801-45ab-b463-3398cdeea471") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:50.523307 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:50.523270 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:50.523488 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:50.523278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:50.523488 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:50.523283 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:50.523488 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:50.523444 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:50.523689 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:50.523542 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:50.523689 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:50.523640 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:52.524546 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:52.524505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:52.525155 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:52.524608 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:52.525155 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:52.524642 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:52.525155 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:52.524672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:52.525155 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:52.524719 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:52.525155 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:52.524794 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:54.524072 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:54.524034 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt"
Apr 24 21:16:54.524072 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:54.524067 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:16:54.524476 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:54.524039 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:16:54.524476 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:54.524195 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c7xmt" podUID="e57ddd4a-8801-45ab-b463-3398cdeea471"
Apr 24 21:16:54.524476 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:54.524221 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pnbsk" podUID="763a8f03-407c-4bd4-b683-27ba0614f163"
Apr 24 21:16:54.524476 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:54.524284 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xbxm" podUID="bfa91d92-6b3a-44d1-8b22-abae9ded2a1c"
Apr 24 21:16:55.255572 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.255541 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-219.ec2.internal" event="NodeReady"
Apr 24 21:16:55.255740 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.255691 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:16:55.307761 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.307719 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wxcvf"]
Apr 24 21:16:55.333462 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.333428 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6948fb494f-4hnkq"]
Apr 24 21:16:55.333706 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.333600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf"
Apr 24 21:16:55.348660 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.348618 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz"]
Apr 24 21:16:55.348812 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.348711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:16:55.362993 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.362962 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:16:55.363499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.363246 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:16:55.363499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.363266 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:55.363499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.363294 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:55.363499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.363248 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:16:55.363499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.363248 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qrf4x\""
Apr 24 21:16:55.363499 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.363296 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:16:55.364543 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.364511 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dqkgc\""
Apr 24 21:16:55.364653 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.364557 2578 reflector.go:430] "Caches populated"
type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 21:16:55.365337 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.365146 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9"] Apr 24 21:16:55.365337 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.365284 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.375241 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.375217 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.375460 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.375439 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:16:55.375539 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.375514 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:16:55.375599 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.375447 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.376327 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.376305 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zqbg5\"" Apr 24 21:16:55.381138 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.381110 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" 
Apr 24 21:16:55.382449 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.382413 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 21:16:55.384210 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.384186 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wxcvf"] Apr 24 21:16:55.384210 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.384216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz"] Apr 24 21:16:55.384397 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.384230 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9"] Apr 24 21:16:55.384397 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.384321 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"] Apr 24 21:16:55.384397 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.384346 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" Apr 24 21:16:55.389050 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.388602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-k4hqs\"" Apr 24 21:16:55.389050 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.388880 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.389268 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.389119 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.406171 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.406137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rcb\" (UniqueName: \"kubernetes.io/projected/158622c6-3d53-4f0c-bd64-00a4f1f32d69-kube-api-access-z4rcb\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.406515 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.406188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158622c6-3d53-4f0c-bd64-00a4f1f32d69-config\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.406515 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.406215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/158622c6-3d53-4f0c-bd64-00a4f1f32d69-trusted-ca\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.406647 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.406567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158622c6-3d53-4f0c-bd64-00a4f1f32d69-serving-cert\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.406746 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.406713 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bm7ns"] Apr 24 21:16:55.406970 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.406949 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.409205 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.409180 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 21:16:55.410389 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.410363 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.410569 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.410419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wz2s5\"" Apr 24 21:16:55.410569 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.410493 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 21:16:55.410765 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.410746 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.425302 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.425264 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8"] Apr 24 21:16:55.425465 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.425438 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.428218 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.428193 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.428394 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.428378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.428468 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.428415 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 21:16:55.428517 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.428419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-ttcnb\"" Apr 24 21:16:55.429251 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.429229 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 21:16:55.434047 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.433882 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 21:16:55.444264 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.444232 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x"] Apr 24 21:16:55.444440 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.444395 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.446998 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.446968 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:16:55.447252 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.447218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:16:55.447363 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.447271 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4cwt8\"" Apr 24 21:16:55.447363 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.447218 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.449227 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.449202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.457186 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.457156 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"] Apr 24 21:16:55.457378 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.457360 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.465294 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.465229 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 21:16:55.465456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.465309 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 21:16:55.465456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.465342 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-t9r9d\"" Apr 24 21:16:55.465456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.465229 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.465728 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.465712 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.473419 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.472172 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5b7858ff9b-qlsj4"] Apr 24 21:16:55.473419 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.472598 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:55.480970 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.480944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2cbt7\"" Apr 24 21:16:55.481224 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.481006 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.481319 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.481054 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.481319 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.481094 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 21:16:55.492851 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.492821 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mxfwh"] Apr 24 21:16:55.493059 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.493037 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.503206 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.503175 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:16:55.503357 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.503272 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.503425 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.503175 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.504156 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.504009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:16:55.504156 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.504040 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:16:55.504639 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.504617 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:16:55.505123 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.505102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sbnww\"" Apr 24 21:16:55.506986 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.506904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/158622c6-3d53-4f0c-bd64-00a4f1f32d69-trusted-ca\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 
21:16:55.506986 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.506958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-bound-sa-token\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507154 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.506989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-trusted-ca\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507154 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtc9x\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-kube-api-access-wtc9x\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507154 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.507154 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507127 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507153 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w684w\" (UniqueName: \"kubernetes.io/projected/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-kube-api-access-w684w\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11dd689a-4bd3-402d-99a1-0a89fed0b025-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-installation-pull-secrets\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-service-ca-bundle\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-ca-trust-extracted\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507313 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzwq\" (UniqueName: \"kubernetes.io/projected/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-kube-api-access-dgzwq\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.507340 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-snapshots\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48n5w\" (UniqueName: \"kubernetes.io/projected/b7e9209d-9d38-4fb0-a8ad-003c896fe276-kube-api-access-48n5w\") pod \"volume-data-source-validator-7c6cbb6c87-6l5v9\" (UID: 
\"b7e9209d-9d38-4fb0-a8ad-003c896fe276\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158622c6-3d53-4f0c-bd64-00a4f1f32d69-serving-cert\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507384 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507420 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-tmp\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507516 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rcb\" (UniqueName: \"kubernetes.io/projected/158622c6-3d53-4f0c-bd64-00a4f1f32d69-kube-api-access-z4rcb\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-image-registry-private-configuration\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507618 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-certificates\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.507965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hct5\" (UniqueName: \"kubernetes.io/projected/11dd689a-4bd3-402d-99a1-0a89fed0b025-kube-api-access-7hct5\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.507965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507691 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11dd689a-4bd3-402d-99a1-0a89fed0b025-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.507965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158622c6-3d53-4f0c-bd64-00a4f1f32d69-config\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.507965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-serving-cert\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.507965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.507962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/158622c6-3d53-4f0c-bd64-00a4f1f32d69-trusted-ca\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.508510 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.508485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158622c6-3d53-4f0c-bd64-00a4f1f32d69-config\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: 
\"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.511407 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.511359 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8"] Apr 24 21:16:55.511526 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.511442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:55.512866 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.512682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158622c6-3d53-4f0c-bd64-00a4f1f32d69-serving-cert\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.515310 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.515092 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.515310 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.515244 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:16:55.515310 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.515248 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6g56g\"" Apr 24 21:16:55.515310 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.515249 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.518323 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.518287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z4rcb\" (UniqueName: \"kubernetes.io/projected/158622c6-3d53-4f0c-bd64-00a4f1f32d69-kube-api-access-z4rcb\") pod \"console-operator-9d4b6777b-wxcvf\" (UID: \"158622c6-3d53-4f0c-bd64-00a4f1f32d69\") " pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.530325 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.530296 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w"] Apr 24 21:16:55.530654 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.530525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.535109 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.535086 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:16:55.542898 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.542874 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4"] Apr 24 21:16:55.543043 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.543025 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" Apr 24 21:16:55.545476 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.545451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-wqlvk\"" Apr 24 21:16:55.545593 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.545501 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.545907 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.545887 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.560662 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560623 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6948fb494f-4hnkq"] Apr 24 21:16:55.560662 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"] Apr 24 21:16:55.560662 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560664 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560687 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bm7ns"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560695 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560704 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560712 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b7858ff9b-qlsj4"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560722 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560730 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mxfwh"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560750 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v4ffd"] Apr 24 21:16:55.560879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.560779 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.563548 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.563524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 21:16:55.563787 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.563772 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 21:16:55.564003 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.563988 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 21:16:55.564091 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.564014 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 21:16:55.575152 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.575123 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v4ffd"] Apr 24 21:16:55.575331 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.575299 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.577777 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.577756 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:16:55.577913 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.577756 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:16:55.578008 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.577986 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9vgbs\"" Apr 24 21:16:55.578164 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.578149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:16:55.578241 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.578228 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:16:55.609168 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-certificates\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.609343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609182 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-tmp\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 
21:16:55.609343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtfc\" (UniqueName: \"kubernetes.io/projected/c034d95c-ec37-4478-a57c-8027d3a83be5-kube-api-access-7wtfc\") pod \"managed-serviceaccount-addon-agent-764588ff59-qsth8\" (UID: \"c034d95c-ec37-4478-a57c-8027d3a83be5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.609343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c034d95c-ec37-4478-a57c-8027d3a83be5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764588ff59-qsth8\" (UID: \"c034d95c-ec37-4478-a57c-8027d3a83be5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.609343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11dd689a-4bd3-402d-99a1-0a89fed0b025-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.609498 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-serving-cert\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.609498 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609403 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8td\" (UniqueName: \"kubernetes.io/projected/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-kube-api-access-6k8td\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.609498 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtc9x\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-kube-api-access-wtc9x\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.609498 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.609498 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-default-certificate\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.609729 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w684w\" (UniqueName: 
\"kubernetes.io/projected/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-kube-api-access-w684w\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.609729 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-installation-pull-secrets\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.609729 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.609609 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:55.609729 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-config\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.609729 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzwq\" (UniqueName: \"kubernetes.io/projected/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-kube-api-access-dgzwq\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.609729 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.609683 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls podName:fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.109665428 +0000 UTC m=+34.177270985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bhp8b" (UID: "fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-certificates\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11dd689a-4bd3-402d-99a1-0a89fed0b025-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-snapshots\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.610064 ip-10-0-132-219 
kubenswrapper[2578]: I0424 21:16:55.609872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mgph\" (UniqueName: \"kubernetes.io/projected/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-kube-api-access-2mgph\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.609989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-tmp\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.610064 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-klusterlet-config\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-image-registry-private-configuration\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hct5\" (UniqueName: \"kubernetes.io/projected/11dd689a-4bd3-402d-99a1-0a89fed0b025-kube-api-access-7hct5\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-bound-sa-token\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc4sr\" (UniqueName: \"kubernetes.io/projected/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-kube-api-access-fc4sr\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-tmp\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610397 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-trusted-ca\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-stats-auth\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-snapshots\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11dd689a-4bd3-402d-99a1-0a89fed0b025-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.610534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.610522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq6g\" (UniqueName: \"kubernetes.io/projected/7321b947-9a04-45eb-a042-a216d960cbb7-kube-api-access-gvq6g\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:55.612222 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-trusted-ca\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.612599 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-service-ca-bundle\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.612701 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-ca-trust-extracted\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.612757 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-48n5w\" (UniqueName: \"kubernetes.io/projected/b7e9209d-9d38-4fb0-a8ad-003c896fe276-kube-api-access-48n5w\") pod \"volume-data-source-validator-7c6cbb6c87-6l5v9\" (UID: \"b7e9209d-9d38-4fb0-a8ad-003c896fe276\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" Apr 24 21:16:55.612757 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612733 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:55.612847 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.612847 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.612820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmdt2\" (UniqueName: \"kubernetes.io/projected/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-kube-api-access-vmdt2\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:55.614023 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.613020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.614023 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.613339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-ca-trust-extracted\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.614023 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.613621 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:55.614023 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.613646 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6948fb494f-4hnkq: secret "image-registry-tls" not found Apr 24 21:16:55.614023 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.613721 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls podName:afad5a5d-bb00-427a-ad3b-6ad4a899dd16 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.113695077 +0000 UTC m=+34.181300636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls") pod "image-registry-6948fb494f-4hnkq" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16") : secret "image-registry-tls" not found Apr 24 21:16:55.614339 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.614147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-installation-pull-secrets\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.614392 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.614348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11dd689a-4bd3-402d-99a1-0a89fed0b025-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.614477 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.614459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-image-registry-private-configuration\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.617801 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.617764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: 
\"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.618015 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.617995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-serving-cert\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.618212 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.618190 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-service-ca-bundle\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.623738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.623711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzwq\" (UniqueName: \"kubernetes.io/projected/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-kube-api-access-dgzwq\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:55.623845 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.623820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtc9x\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-kube-api-access-wtc9x\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.625773 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.625749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-bound-sa-token\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:55.628563 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.628536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hct5\" (UniqueName: \"kubernetes.io/projected/11dd689a-4bd3-402d-99a1-0a89fed0b025-kube-api-access-7hct5\") pod \"kube-storage-version-migrator-operator-6769c5d45-2dptz\" (UID: \"11dd689a-4bd3-402d-99a1-0a89fed0b025\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.628692 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.628594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w684w\" (UniqueName: \"kubernetes.io/projected/4306556d-2b6a-4b8d-b6ef-8342d2ce44d8-kube-api-access-w684w\") pod \"insights-operator-585dfdc468-bm7ns\" (UID: \"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8\") " pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.629356 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.629337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48n5w\" (UniqueName: \"kubernetes.io/projected/b7e9209d-9d38-4fb0-a8ad-003c896fe276-kube-api-access-48n5w\") pod \"volume-data-source-validator-7c6cbb6c87-6l5v9\" (UID: \"b7e9209d-9d38-4fb0-a8ad-003c896fe276\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" Apr 24 21:16:55.647538 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.647501 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:16:55.677955 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.677896 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" Apr 24 21:16:55.697595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.697461 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" Apr 24 21:16:55.713660 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc4sr\" (UniqueName: \"kubernetes.io/projected/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-kube-api-access-fc4sr\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.713791 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-stats-auth\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.713791 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvq6g\" (UniqueName: \"kubernetes.io/projected/7321b947-9a04-45eb-a042-a216d960cbb7-kube-api-access-gvq6g\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:55.713791 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713758 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:55.713957 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmdt2\" (UniqueName: \"kubernetes.io/projected/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-kube-api-access-vmdt2\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:55.713957 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq86\" (UniqueName: \"kubernetes.io/projected/5e304870-95cc-4401-8456-8388b7b9d759-kube-api-access-lfq86\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.713957 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-tmp\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.713957 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-ca\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.713957 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.713940 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:55.714182 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.714034 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls podName:7e42dfc6-9944-4e7e-a1d7-656b2871ff67 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.214010523 +0000 UTC m=+34.281616083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lx5kc" (UID: "7e42dfc6-9944-4e7e-a1d7-656b2871ff67") : secret "samples-operator-tls" not found Apr 24 21:16:55.714182 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.713945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtfc\" (UniqueName: \"kubernetes.io/projected/c034d95c-ec37-4478-a57c-8027d3a83be5-kube-api-access-7wtfc\") pod \"managed-serviceaccount-addon-agent-764588ff59-qsth8\" (UID: \"c034d95c-ec37-4478-a57c-8027d3a83be5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.714182 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c034d95c-ec37-4478-a57c-8027d3a83be5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764588ff59-qsth8\" 
(UID: \"c034d95c-ec37-4478-a57c-8027d3a83be5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.714182 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714168 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8td\" (UniqueName: \"kubernetes.io/projected/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-kube-api-access-6k8td\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.714358 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714204 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-hub\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.714358 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns4l2\" (UniqueName: \"kubernetes.io/projected/dd6441e0-b9f0-483a-95f8-56bff0d86e71-kube-api-access-ns4l2\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.714358 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-default-certificate\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.714358 ip-10-0-132-219 kubenswrapper[2578]: I0424 
21:16:55.714305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.714358 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.714595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-config\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.714595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntsmj\" (UniqueName: \"kubernetes.io/projected/8408d184-7b3a-4075-9db8-5cb9fae0821c-kube-api-access-ntsmj\") pod \"network-check-source-8894fc9bd-2hr4w\" (UID: \"8408d184-7b3a-4075-9db8-5cb9fae0821c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" Apr 24 21:16:55.714595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mgph\" (UniqueName: 
\"kubernetes.io/projected/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-kube-api-access-2mgph\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.714595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:55.714595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714542 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-tmp\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.714833 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.714833 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.714705 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:55.714833 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.714833 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.714769 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert podName:7321b947-9a04-45eb-a042-a216d960cbb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.214754871 +0000 UTC m=+34.282360428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert") pod "ingress-canary-mxfwh" (UID: "7321b947-9a04-45eb-a042-a216d960cbb7") : secret "canary-serving-cert" not found Apr 24 21:16:55.714833 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.714833 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.714825 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.21481869 +0000 UTC m=+34.282424246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-klusterlet-config\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6441e0-b9f0-483a-95f8-56bff0d86e71-config-volume\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.714881 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5e304870-95cc-4401-8456-8388b7b9d759-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-config\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.714958 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.214941879 +0000 UTC m=+34.282547453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : secret "router-metrics-certs-default" not found Apr 24 21:16:55.715162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.714995 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd6441e0-b9f0-483a-95f8-56bff0d86e71-tmp-dir\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.717485 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.717423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-stats-auth\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.718036 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.717990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-default-certificate\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.718036 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.718001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c034d95c-ec37-4478-a57c-8027d3a83be5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764588ff59-qsth8\" (UID: \"c034d95c-ec37-4478-a57c-8027d3a83be5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.718285 
ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.718264 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.718411 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.718393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-klusterlet-config\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.733896 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.733863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8td\" (UniqueName: \"kubernetes.io/projected/b3a09e1e-8c44-4778-a4f7-c54bd1cb4171-kube-api-access-6k8td\") pod \"klusterlet-addon-workmgr-85c4b8b497-2tnb8\" (UID: \"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.734509 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.733863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmdt2\" (UniqueName: \"kubernetes.io/projected/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-kube-api-access-vmdt2\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:55.734785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.733973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvq6g\" (UniqueName: 
\"kubernetes.io/projected/7321b947-9a04-45eb-a042-a216d960cbb7-kube-api-access-gvq6g\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:55.735388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.735361 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtfc\" (UniqueName: \"kubernetes.io/projected/c034d95c-ec37-4478-a57c-8027d3a83be5-kube-api-access-7wtfc\") pod \"managed-serviceaccount-addon-agent-764588ff59-qsth8\" (UID: \"c034d95c-ec37-4478-a57c-8027d3a83be5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.735494 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.735465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc4sr\" (UniqueName: \"kubernetes.io/projected/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-kube-api-access-fc4sr\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:55.736835 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.736787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mgph\" (UniqueName: \"kubernetes.io/projected/940bb44f-905e-43d7-a9d4-7cb49ef94c0e-kube-api-access-2mgph\") pod \"service-ca-operator-d6fc45fc5-l4p7x\" (UID: \"940bb44f-905e-43d7-a9d4-7cb49ef94c0e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.737047 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.736998 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" Apr 24 21:16:55.768519 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.767594 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" Apr 24 21:16:55.782538 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.780363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq86\" (UniqueName: \"kubernetes.io/projected/5e304870-95cc-4401-8456-8388b7b9d759-kube-api-access-lfq86\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-ca\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-hub\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns4l2\" (UniqueName: \"kubernetes.io/projected/dd6441e0-b9f0-483a-95f8-56bff0d86e71-kube-api-access-ns4l2\") pod 
\"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntsmj\" (UniqueName: \"kubernetes.io/projected/8408d184-7b3a-4075-9db8-5cb9fae0821c-kube-api-access-ntsmj\") pod \"network-check-source-8894fc9bd-2hr4w\" (UID: \"8408d184-7b3a-4075-9db8-5cb9fae0821c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820645 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6441e0-b9f0-483a-95f8-56bff0d86e71-config-volume\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5e304870-95cc-4401-8456-8388b7b9d759-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.820712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd6441e0-b9f0-483a-95f8-56bff0d86e71-tmp-dir\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.822320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.821214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd6441e0-b9f0-483a-95f8-56bff0d86e71-tmp-dir\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.825983 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.823167 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:55.825983 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:55.823244 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls podName:dd6441e0-b9f0-483a-95f8-56bff0d86e71 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:56.323222779 +0000 UTC m=+34.390828350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls") pod "dns-default-v4ffd" (UID: "dd6441e0-b9f0-483a-95f8-56bff0d86e71") : secret "dns-default-metrics-tls" not found Apr 24 21:16:55.825983 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.825123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6441e0-b9f0-483a-95f8-56bff0d86e71-config-volume\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.825983 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.825889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5e304870-95cc-4401-8456-8388b7b9d759-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.827731 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.827670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-hub\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.830683 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.830607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: 
\"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.839584 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.839510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-ca\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.840193 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.840135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e304870-95cc-4401-8456-8388b7b9d759-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.841541 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.840763 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" Apr 24 21:16:55.845038 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.844963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntsmj\" (UniqueName: \"kubernetes.io/projected/8408d184-7b3a-4075-9db8-5cb9fae0821c-kube-api-access-ntsmj\") pod \"network-check-source-8894fc9bd-2hr4w\" (UID: \"8408d184-7b3a-4075-9db8-5cb9fae0821c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" Apr 24 21:16:55.845038 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.845004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfq86\" (UniqueName: \"kubernetes.io/projected/5e304870-95cc-4401-8456-8388b7b9d759-kube-api-access-lfq86\") pod \"cluster-proxy-proxy-agent-5df456cfb5-vv7p4\" (UID: \"5e304870-95cc-4401-8456-8388b7b9d759\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.851694 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.851236 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" Apr 24 21:16:55.851694 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.851433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns4l2\" (UniqueName: \"kubernetes.io/projected/dd6441e0-b9f0-483a-95f8-56bff0d86e71-kube-api-access-ns4l2\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:55.859439 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.858114 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hdlfg"] Apr 24 21:16:55.871041 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.870812 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" Apr 24 21:16:55.879629 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.879574 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wxcvf"] Apr 24 21:16:55.879791 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.879760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:55.887337 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.887309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4rdlg\"" Apr 24 21:16:55.907246 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.907215 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz"] Apr 24 21:16:55.914627 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.914594 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9"] Apr 24 21:16:55.935572 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.935537 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bm7ns"] Apr 24 21:16:55.958138 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:55.958090 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158622c6_3d53_4f0c_bd64_00a4f1f32d69.slice/crio-c97efa994a445a2ccaa8508cba56e9d58678a70a37813fbf3d75cbaed198db2c WatchSource:0}: Error finding container c97efa994a445a2ccaa8508cba56e9d58678a70a37813fbf3d75cbaed198db2c: Status 404 returned error can't find the container with id c97efa994a445a2ccaa8508cba56e9d58678a70a37813fbf3d75cbaed198db2c Apr 24 21:16:55.958674 ip-10-0-132-219 
kubenswrapper[2578]: W0424 21:16:55.958644 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11dd689a_4bd3_402d_99a1_0a89fed0b025.slice/crio-8508117722a3b27cd3dcd1ba24aa66691b41b48ba151b847bde4dfae59065a5c WatchSource:0}: Error finding container 8508117722a3b27cd3dcd1ba24aa66691b41b48ba151b847bde4dfae59065a5c: Status 404 returned error can't find the container with id 8508117722a3b27cd3dcd1ba24aa66691b41b48ba151b847bde4dfae59065a5c Apr 24 21:16:55.959578 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:55.959542 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e9209d_9d38_4fb0_a8ad_003c896fe276.slice/crio-914096b80a2e8ef95c99f7c938cd3f920ad18f48dd21b16f72dbf7bcab215afa WatchSource:0}: Error finding container 914096b80a2e8ef95c99f7c938cd3f920ad18f48dd21b16f72dbf7bcab215afa: Status 404 returned error can't find the container with id 914096b80a2e8ef95c99f7c938cd3f920ad18f48dd21b16f72dbf7bcab215afa Apr 24 21:16:55.960549 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:55.960400 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4306556d_2b6a_4b8d_b6ef_8342d2ce44d8.slice/crio-4502d8e8841be2319726d566dfcceef41b175fd6d686b04a291ba1869c87edb8 WatchSource:0}: Error finding container 4502d8e8841be2319726d566dfcceef41b175fd6d686b04a291ba1869c87edb8: Status 404 returned error can't find the container with id 4502d8e8841be2319726d566dfcceef41b175fd6d686b04a291ba1869c87edb8 Apr 24 21:16:55.964279 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:55.964142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8"] Apr 24 21:16:55.966481 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:55.966459 2578 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc034d95c_ec37_4478_a57c_8027d3a83be5.slice/crio-bdac1cc44da64f6adb0c9d49b3f41ae51fa9ee390b5c790a840aedb48d2908d7 WatchSource:0}: Error finding container bdac1cc44da64f6adb0c9d49b3f41ae51fa9ee390b5c790a840aedb48d2908d7: Status 404 returned error can't find the container with id bdac1cc44da64f6adb0c9d49b3f41ae51fa9ee390b5c790a840aedb48d2908d7 Apr 24 21:16:56.023223 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.023116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7sm\" (UniqueName: \"kubernetes.io/projected/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-kube-api-access-wm7sm\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.023409 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.023220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-tmp-dir\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.023409 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.023260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-hosts-file\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.124788 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.124723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-hosts-file\") pod \"node-resolver-hdlfg\" (UID: 
\"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.124997 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.124953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:56.124997 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.124970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-hosts-file\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.125135 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.125116 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:56.125185 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.125140 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6948fb494f-4hnkq: secret "image-registry-tls" not found Apr 24 21:16:56.125228 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.125204 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls podName:afad5a5d-bb00-427a-ad3b-6ad4a899dd16 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.125182551 +0000 UTC m=+35.192788128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls") pod "image-registry-6948fb494f-4hnkq" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16") : secret "image-registry-tls" not found Apr 24 21:16:56.125718 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.125122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:56.125877 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.125859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7sm\" (UniqueName: \"kubernetes.io/projected/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-kube-api-access-wm7sm\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.125967 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.125920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-tmp-dir\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.126425 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.126401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-tmp-dir\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.126425 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.126420 
2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:56.126577 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.126502 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls podName:fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.126484095 +0000 UTC m=+35.194089678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bhp8b" (UID: "fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:56.144456 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.144418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7sm\" (UniqueName: \"kubernetes.io/projected/42a9166c-4a9e-42ec-8194-8b7b3852c9a2-kube-api-access-wm7sm\") pod \"node-resolver-hdlfg\" (UID: \"42a9166c-4a9e-42ec-8194-8b7b3852c9a2\") " pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.178811 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.177106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w"] Apr 24 21:16:56.191974 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.191920 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hdlfg" Apr 24 21:16:56.199808 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:56.199777 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a9166c_4a9e_42ec_8194_8b7b3852c9a2.slice/crio-256ca18f203d6332b58a9848ef117b43b0719dabc76cd1379428f791356f71ce WatchSource:0}: Error finding container 256ca18f203d6332b58a9848ef117b43b0719dabc76cd1379428f791356f71ce: Status 404 returned error can't find the container with id 256ca18f203d6332b58a9848ef117b43b0719dabc76cd1379428f791356f71ce Apr 24 21:16:56.206067 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.206040 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8"] Apr 24 21:16:56.208994 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:56.208956 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3a09e1e_8c44_4778_a4f7_c54bd1cb4171.slice/crio-c582208956deca0babeada77cfd0c4372ad0a7cc1c0892f8eb6982f96add7ef8 WatchSource:0}: Error finding container c582208956deca0babeada77cfd0c4372ad0a7cc1c0892f8eb6982f96add7ef8: Status 404 returned error can't find the container with id c582208956deca0babeada77cfd0c4372ad0a7cc1c0892f8eb6982f96add7ef8 Apr 24 21:16:56.226698 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.226658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:56.226698 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.226702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:56.226897 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.226839 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.226817451 +0000 UTC m=+35.294423011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:56.226897 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.226869 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:56.226997 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.226914 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.226901619 +0000 UTC m=+35.294507176 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : secret "router-metrics-certs-default" not found Apr 24 21:16:56.226997 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.226871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:56.226997 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.226946 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:56.226997 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.226994 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls podName:7e42dfc6-9944-4e7e-a1d7-656b2871ff67 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.226978649 +0000 UTC m=+35.294584214 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lx5kc" (UID: "7e42dfc6-9944-4e7e-a1d7-656b2871ff67") : secret "samples-operator-tls" not found Apr 24 21:16:56.227133 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.227072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:56.227133 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.227095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:56.227198 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.227157 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:56.227198 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.227184 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert podName:7321b947-9a04-45eb-a042-a216d960cbb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.227176764 +0000 UTC m=+35.294782320 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert") pod "ingress-canary-mxfwh" (UID: "7321b947-9a04-45eb-a042-a216d960cbb7") : secret "canary-serving-cert" not found Apr 24 21:16:56.227270 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.227215 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:56.227313 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.227272 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs podName:bfa91d92-6b3a-44d1-8b22-abae9ded2a1c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:28.227257575 +0000 UTC m=+66.294863141 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs") pod "network-metrics-daemon-4xbxm" (UID: "bfa91d92-6b3a-44d1-8b22-abae9ded2a1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:56.232595 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.232543 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x"] Apr 24 21:16:56.235904 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:56.235875 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940bb44f_905e_43d7_a9d4_7cb49ef94c0e.slice/crio-0ea62ce7669f07519ef7b4ba355da3f9521fb0300677d39b99da4dd0f2e09e16 WatchSource:0}: Error finding container 0ea62ce7669f07519ef7b4ba355da3f9521fb0300677d39b99da4dd0f2e09e16: Status 404 returned error can't find the container with id 0ea62ce7669f07519ef7b4ba355da3f9521fb0300677d39b99da4dd0f2e09e16 Apr 24 21:16:56.273117 ip-10-0-132-219 kubenswrapper[2578]: I0424 
21:16:56.272877 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4"] Apr 24 21:16:56.328068 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.328039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:56.328252 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.328233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:56.328316 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.328233 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:56.328364 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:56.328339 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls podName:dd6441e0-b9f0-483a-95f8-56bff0d86e71 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.328322522 +0000 UTC m=+35.395928089 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls") pod "dns-default-v4ffd" (UID: "dd6441e0-b9f0-483a-95f8-56bff0d86e71") : secret "dns-default-metrics-tls" not found Apr 24 21:16:56.332149 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.332122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9r5w\" (UniqueName: \"kubernetes.io/projected/763a8f03-407c-4bd4-b683-27ba0614f163-kube-api-access-x9r5w\") pod \"network-check-target-pnbsk\" (UID: \"763a8f03-407c-4bd4-b683-27ba0614f163\") " pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:56.523785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.523132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:56.523785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.523238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm" Apr 24 21:16:56.523785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.523537 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:56.527497 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.527428 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:56.527684 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.527653 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6pwcs\"" Apr 24 21:16:56.527962 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.527772 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gjqgx\"" Apr 24 21:16:56.528333 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.528272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:16:56.597753 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.597723 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pnbsk" Apr 24 21:16:56.705393 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.705351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" event={"ID":"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171","Type":"ContainerStarted","Data":"c582208956deca0babeada77cfd0c4372ad0a7cc1c0892f8eb6982f96add7ef8"} Apr 24 21:16:56.718225 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.717209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hdlfg" event={"ID":"42a9166c-4a9e-42ec-8194-8b7b3852c9a2","Type":"ContainerStarted","Data":"ba787fc23f75cd57b88b0a09c1fb593fcb476fc135e69296c40820c87d07113f"} Apr 24 21:16:56.718225 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.717260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hdlfg" event={"ID":"42a9166c-4a9e-42ec-8194-8b7b3852c9a2","Type":"ContainerStarted","Data":"256ca18f203d6332b58a9848ef117b43b0719dabc76cd1379428f791356f71ce"} Apr 24 21:16:56.720496 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.720410 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" event={"ID":"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8","Type":"ContainerStarted","Data":"4502d8e8841be2319726d566dfcceef41b175fd6d686b04a291ba1869c87edb8"} Apr 24 21:16:56.725037 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.724901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" event={"ID":"b7e9209d-9d38-4fb0-a8ad-003c896fe276","Type":"ContainerStarted","Data":"914096b80a2e8ef95c99f7c938cd3f920ad18f48dd21b16f72dbf7bcab215afa"} Apr 24 21:16:56.735344 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.735180 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" event={"ID":"11dd689a-4bd3-402d-99a1-0a89fed0b025","Type":"ContainerStarted","Data":"8508117722a3b27cd3dcd1ba24aa66691b41b48ba151b847bde4dfae59065a5c"} Apr 24 21:16:56.739636 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.739590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" event={"ID":"5e304870-95cc-4401-8456-8388b7b9d759","Type":"ContainerStarted","Data":"70916dc2e10bafae8d83a450e6ec83e27eabda3cdafc571696789848574bc2f0"} Apr 24 21:16:56.751057 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.750701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" event={"ID":"940bb44f-905e-43d7-a9d4-7cb49ef94c0e","Type":"ContainerStarted","Data":"0ea62ce7669f07519ef7b4ba355da3f9521fb0300677d39b99da4dd0f2e09e16"} Apr 24 21:16:56.753903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.753829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" event={"ID":"8408d184-7b3a-4075-9db8-5cb9fae0821c","Type":"ContainerStarted","Data":"5b9fdd71dd45958dc30eeb3fef62983bbaeb8fd5bd4caf619f9db97ac5d27d65"} Apr 24 21:16:56.759746 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.759695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" event={"ID":"c034d95c-ec37-4478-a57c-8027d3a83be5","Type":"ContainerStarted","Data":"bdac1cc44da64f6adb0c9d49b3f41ae51fa9ee390b5c790a840aedb48d2908d7"} Apr 24 21:16:56.768975 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.768905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" 
event={"ID":"158622c6-3d53-4f0c-bd64-00a4f1f32d69","Type":"ContainerStarted","Data":"c97efa994a445a2ccaa8508cba56e9d58678a70a37813fbf3d75cbaed198db2c"} Apr 24 21:16:56.776714 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.776624 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1b4f3db-7875-478c-93b8-7c3155edd974" containerID="c6c096fe4f94a0df8ab7d6abd179c5d25da8d8056b93bf779d21ebb056f05c2d" exitCode=0 Apr 24 21:16:56.776714 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.776689 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerDied","Data":"c6c096fe4f94a0df8ab7d6abd179c5d25da8d8056b93bf779d21ebb056f05c2d"} Apr 24 21:16:56.803068 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.802853 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hdlfg" podStartSLOduration=1.802832827 podStartE2EDuration="1.802832827s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:56.738181529 +0000 UTC m=+34.805787121" watchObservedRunningTime="2026-04-24 21:16:56.802832827 +0000 UTC m=+34.870438406" Apr 24 21:16:56.803068 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:56.803027 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pnbsk"] Apr 24 21:16:56.810871 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:16:56.810503 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763a8f03_407c_4bd4_b683_27ba0614f163.slice/crio-d5ba49dca08a414e3cbe6eee02e69150c908acaf8bb97d8c1760ebd1ea050f7d WatchSource:0}: Error finding container d5ba49dca08a414e3cbe6eee02e69150c908acaf8bb97d8c1760ebd1ea050f7d: Status 404 
returned error can't find the container with id d5ba49dca08a414e3cbe6eee02e69150c908acaf8bb97d8c1760ebd1ea050f7d Apr 24 21:16:57.136276 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.136033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:57.136495 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.136302 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:57.136495 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.136330 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6948fb494f-4hnkq: secret "image-registry-tls" not found Apr 24 21:16:57.136495 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.136341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:57.136495 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.136399 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls podName:afad5a5d-bb00-427a-ad3b-6ad4a899dd16 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.136378514 +0000 UTC m=+37.203984076 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls") pod "image-registry-6948fb494f-4hnkq" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16") : secret "image-registry-tls" not found Apr 24 21:16:57.136714 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.136596 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:57.136714 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.136647 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls podName:fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.136633799 +0000 UTC m=+37.204239359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bhp8b" (UID: "fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.237961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.238094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " 
pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.238200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.238304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.238465 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.238533 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls podName:7e42dfc6-9944-4e7e-a1d7-656b2871ff67 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.23851394 +0000 UTC m=+37.306119512 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lx5kc" (UID: "7e42dfc6-9944-4e7e-a1d7-656b2871ff67") : secret "samples-operator-tls" not found Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.239134 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.239187 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert podName:7321b947-9a04-45eb-a042-a216d960cbb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.239171189 +0000 UTC m=+37.306776754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert") pod "ingress-canary-mxfwh" (UID: "7321b947-9a04-45eb-a042-a216d960cbb7") : secret "canary-serving-cert" not found Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.239397 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.239385201 +0000 UTC m=+37.306990762 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.239456 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:57.239573 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.239541 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.239529282 +0000 UTC m=+37.307134846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : secret "router-metrics-certs-default" not found Apr 24 21:16:57.340598 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.339665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:57.340598 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.340027 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:57.340598 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:57.340095 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls 
podName:dd6441e0-b9f0-483a-95f8-56bff0d86e71 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.340074386 +0000 UTC m=+37.407679948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls") pod "dns-default-v4ffd" (UID: "dd6441e0-b9f0-483a-95f8-56bff0d86e71") : secret "dns-default-metrics-tls" not found Apr 24 21:16:57.797723 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.797676 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pnbsk" event={"ID":"763a8f03-407c-4bd4-b683-27ba0614f163","Type":"ContainerStarted","Data":"d5ba49dca08a414e3cbe6eee02e69150c908acaf8bb97d8c1760ebd1ea050f7d"} Apr 24 21:16:57.846271 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.845827 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1b4f3db-7875-478c-93b8-7c3155edd974" containerID="f63f83e9229df03a3fcbdd0d472845e155b5aef81aabbfd6d6faaf03687eb29b" exitCode=0 Apr 24 21:16:57.846271 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:57.845914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerDied","Data":"f63f83e9229df03a3fcbdd0d472845e155b5aef81aabbfd6d6faaf03687eb29b"} Apr 24 21:16:58.049972 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:58.049116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:58.058600 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:58.058525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e57ddd4a-8801-45ab-b463-3398cdeea471-original-pull-secret\") pod \"global-pull-secret-syncer-c7xmt\" (UID: \"e57ddd4a-8801-45ab-b463-3398cdeea471\") " pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:58.120389 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:58.119952 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c7xmt" Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.161486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.161571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.161719 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.161787 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls podName:fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:03.161765516 +0000 UTC m=+41.229371092 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bhp8b" (UID: "fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.162286 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.162305 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6948fb494f-4hnkq: secret "image-registry-tls" not found Apr 24 21:16:59.162462 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.162402 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls podName:afad5a5d-bb00-427a-ad3b-6ad4a899dd16 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.162385134 +0000 UTC m=+41.229990696 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls") pod "image-registry-6948fb494f-4hnkq" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16") : secret "image-registry-tls" not found Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.262519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.262623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.262655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.262756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:16:59.263774 
ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.262939 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.263009 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls podName:7e42dfc6-9944-4e7e-a1d7-656b2871ff67 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.262986767 +0000 UTC m=+41.330592327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lx5kc" (UID: "7e42dfc6-9944-4e7e-a1d7-656b2871ff67") : secret "samples-operator-tls" not found Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.263500 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.263563 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert podName:7321b947-9a04-45eb-a042-a216d960cbb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.263537148 +0000 UTC m=+41.331142717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert") pod "ingress-canary-mxfwh" (UID: "7321b947-9a04-45eb-a042-a216d960cbb7") : secret "canary-serving-cert" not found Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.263637 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.263626629 +0000 UTC m=+41.331232189 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.263692 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:59.263774 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.263744 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.263733583 +0000 UTC m=+41.331339153 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : secret "router-metrics-certs-default" not found Apr 24 21:16:59.364817 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:16:59.364169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:16:59.364817 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.364382 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:59.364817 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:16:59.364451 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls podName:dd6441e0-b9f0-483a-95f8-56bff0d86e71 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:03.364430971 +0000 UTC m=+41.432036574 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls") pod "dns-default-v4ffd" (UID: "dd6441e0-b9f0-483a-95f8-56bff0d86e71") : secret "dns-default-metrics-tls" not found Apr 24 21:17:03.205351 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.205294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" Apr 24 21:17:03.205975 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.205465 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:03.205975 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.205482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:17:03.205975 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.205548 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls podName:fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.205524883 +0000 UTC m=+49.273130443 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bhp8b" (UID: "fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:03.205975 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.205590 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:03.205975 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.205609 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6948fb494f-4hnkq: secret "image-registry-tls" not found Apr 24 21:17:03.205975 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.205660 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls podName:afad5a5d-bb00-427a-ad3b-6ad4a899dd16 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.20564431 +0000 UTC m=+49.273249867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls") pod "image-registry-6948fb494f-4hnkq" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16") : secret "image-registry-tls" not found Apr 24 21:17:03.306024 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.305985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" Apr 24 21:17:03.306238 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.306082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh" Apr 24 21:17:03.306238 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.306103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:17:03.306238 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306114 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:17:03.306238 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.306121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" Apr 24 21:17:03.306238 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306179 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:17:03.306238 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306199 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls podName:7e42dfc6-9944-4e7e-a1d7-656b2871ff67 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.306178945 +0000 UTC m=+49.373784516 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lx5kc" (UID: "7e42dfc6-9944-4e7e-a1d7-656b2871ff67") : secret "samples-operator-tls" not found Apr 24 21:17:03.306471 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306239 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:03.306471 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306254 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.306235102 +0000 UTC m=+49.373840660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : secret "router-metrics-certs-default" not found Apr 24 21:17:03.306471 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306274 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.306265956 +0000 UTC m=+49.373871515 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : configmap references non-existent config key: service-ca.crt Apr 24 21:17:03.306471 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.306290 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert podName:7321b947-9a04-45eb-a042-a216d960cbb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.306282212 +0000 UTC m=+49.373887781 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert") pod "ingress-canary-mxfwh" (UID: "7321b947-9a04-45eb-a042-a216d960cbb7") : secret "canary-serving-cert" not found Apr 24 21:17:03.407506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:03.407473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd" Apr 24 21:17:03.407716 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.407633 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:03.407716 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:03.407693 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls podName:dd6441e0-b9f0-483a-95f8-56bff0d86e71 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:11.407676388 +0000 UTC m=+49.475281956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls") pod "dns-default-v4ffd" (UID: "dd6441e0-b9f0-483a-95f8-56bff0d86e71") : secret "dns-default-metrics-tls" not found
Apr 24 21:17:04.839373 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:04.839340 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c7xmt"]
Apr 24 21:17:06.023975 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:06.023915 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57ddd4a_8801_45ab_b463_3398cdeea471.slice/crio-ffbb7e0e47d7c8eaa3add2b96041d5b6d4a9f1302e063acace6c28eb675862da WatchSource:0}: Error finding container ffbb7e0e47d7c8eaa3add2b96041d5b6d4a9f1302e063acace6c28eb675862da: Status 404 returned error can't find the container with id ffbb7e0e47d7c8eaa3add2b96041d5b6d4a9f1302e063acace6c28eb675862da
Apr 24 21:17:06.879289 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:06.879250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c7xmt" event={"ID":"e57ddd4a-8801-45ab-b463-3398cdeea471","Type":"ContainerStarted","Data":"ffbb7e0e47d7c8eaa3add2b96041d5b6d4a9f1302e063acace6c28eb675862da"}
Apr 24 21:17:08.892468 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:08.892416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" event={"ID":"d1b4f3db-7875-478c-93b8-7c3155edd974","Type":"ContainerStarted","Data":"ce3cdcd7f5fdb6777efd51d864219e62b6a019601e973358bf61764427e5fe09"}
Apr 24 21:17:08.982841 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:08.981692 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wpmrx" podStartSLOduration=16.05780666 podStartE2EDuration="46.981672412s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.085391097 +0000 UTC m=+3.152996654" lastFinishedPulling="2026-04-24 21:16:56.009256849 +0000 UTC m=+34.076862406" observedRunningTime="2026-04-24 21:17:08.98157845 +0000 UTC m=+47.049184029" watchObservedRunningTime="2026-04-24 21:17:08.981672412 +0000 UTC m=+47.049277996"
Apr 24 21:17:09.147364 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.146961 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"]
Apr 24 21:17:09.170680 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.169674 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.170680 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.170561 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"]
Apr 24 21:17:09.183035 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.174954 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 21:17:09.183035 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.181769 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6z8tz\""
Apr 24 21:17:09.183035 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.182090 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 21:17:09.266304 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.266268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e58045bb-0010-4145-94ce-dd45fc5b114f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.266497 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.266324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.367155 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.367113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e58045bb-0010-4145-94ce-dd45fc5b114f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.367349 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.367188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.368194 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.368123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e58045bb-0010-4145-94ce-dd45fc5b114f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.368615 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:09.368441 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:17:09.368615 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:09.368516 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert podName:e58045bb-0010-4145-94ce-dd45fc5b114f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:09.868496064 +0000 UTC m=+47.936101624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jq9c6" (UID: "e58045bb-0010-4145-94ce-dd45fc5b114f") : secret "networking-console-plugin-cert" not found
Apr 24 21:17:09.873464 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.873407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:09.873650 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:09.873633 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:17:09.873708 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:09.873689 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert podName:e58045bb-0010-4145-94ce-dd45fc5b114f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:10.873673685 +0000 UTC m=+48.941279241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jq9c6" (UID: "e58045bb-0010-4145-94ce-dd45fc5b114f") : secret "networking-console-plugin-cert" not found
Apr 24 21:17:09.908627 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.908586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" event={"ID":"5e304870-95cc-4401-8456-8388b7b9d759","Type":"ContainerStarted","Data":"5c58c2f96c63198bdebe50b7a777976140ed76aab10b123cf27ae795ad09f4de"}
Apr 24 21:17:09.913440 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.910032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" event={"ID":"940bb44f-905e-43d7-a9d4-7cb49ef94c0e","Type":"ContainerStarted","Data":"c9b3d553c396b36fc155c1feeba8391dc3cbfa1151cc5acb11da3aa4e19bb695"}
Apr 24 21:17:09.916759 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.916179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pnbsk" event={"ID":"763a8f03-407c-4bd4-b683-27ba0614f163","Type":"ContainerStarted","Data":"e98fb8744ac82a7254cd18314f19861f9e684811fd116370787b6c1e56501bd6"}
Apr 24 21:17:09.916759 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.916724 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:17:09.920413 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.919221 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" event={"ID":"8408d184-7b3a-4075-9db8-5cb9fae0821c","Type":"ContainerStarted","Data":"669dbafb9aaa32b4bb38901f6a713cd9a84e4663b7ae191e445716f7e6c9a32e"}
Apr 24 21:17:09.922704 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.922615 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" event={"ID":"c034d95c-ec37-4478-a57c-8027d3a83be5","Type":"ContainerStarted","Data":"a540414efc0cd99d296ea9fea9316dbfa30c9586bb9726425672beb99ccaf119"}
Apr 24 21:17:09.925656 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.925630 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/0.log"
Apr 24 21:17:09.925804 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.925705 2578 generic.go:358] "Generic (PLEG): container finished" podID="158622c6-3d53-4f0c-bd64-00a4f1f32d69" containerID="1228e7d692d3714af73aa02c5a52ecc1184505dbce36c419d133c7bb3837fe09" exitCode=255
Apr 24 21:17:09.925804 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.925793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" event={"ID":"158622c6-3d53-4f0c-bd64-00a4f1f32d69","Type":"ContainerDied","Data":"1228e7d692d3714af73aa02c5a52ecc1184505dbce36c419d133c7bb3837fe09"}
Apr 24 21:17:09.926205 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.926186 2578 scope.go:117] "RemoveContainer" containerID="1228e7d692d3714af73aa02c5a52ecc1184505dbce36c419d133c7bb3837fe09"
Apr 24 21:17:09.930030 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.929987 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" event={"ID":"b3a09e1e-8c44-4778-a4f7-c54bd1cb4171","Type":"ContainerStarted","Data":"a14b9c84a78cf9cba2900525ad7644f5935a87168e32ee73fee781eecacbf670"}
Apr 24 21:17:09.930452 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.930337 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8"
Apr 24 21:17:09.933174 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.932569 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8"
Apr 24 21:17:09.934807 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.934745 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" podStartSLOduration=21.481710121 podStartE2EDuration="33.934727043s" podCreationTimestamp="2026-04-24 21:16:36 +0000 UTC" firstStartedPulling="2026-04-24 21:16:56.238018548 +0000 UTC m=+34.305624105" lastFinishedPulling="2026-04-24 21:17:08.691035448 +0000 UTC m=+46.758641027" observedRunningTime="2026-04-24 21:17:09.934016253 +0000 UTC m=+48.001621832" watchObservedRunningTime="2026-04-24 21:17:09.934727043 +0000 UTC m=+48.002332625"
Apr 24 21:17:09.936955 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.935869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" event={"ID":"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8","Type":"ContainerStarted","Data":"e09d262941f80c24d042e1b8febde369f9c198cfd873ad6593edf14a37c3f88f"}
Apr 24 21:17:09.940124 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.939600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" event={"ID":"b7e9209d-9d38-4fb0-a8ad-003c896fe276","Type":"ContainerStarted","Data":"b807588925e975004356d0007c2a25ee9543f11ccd493889c8e4cbc0349019e9"}
Apr 24 21:17:09.944002 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.943761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" event={"ID":"11dd689a-4bd3-402d-99a1-0a89fed0b025","Type":"ContainerStarted","Data":"25830fce8e1b4006ec4d0df666c57fb0a537020ffdc4312b2f939c95ce151d74"}
Apr 24 21:17:09.955324 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.954018 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pnbsk" podStartSLOduration=36.003507535 podStartE2EDuration="47.953996348s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:16:56.814772436 +0000 UTC m=+34.882378000" lastFinishedPulling="2026-04-24 21:17:08.765261245 +0000 UTC m=+46.832866813" observedRunningTime="2026-04-24 21:17:09.952901979 +0000 UTC m=+48.020507559" watchObservedRunningTime="2026-04-24 21:17:09.953996348 +0000 UTC m=+48.021601929"
Apr 24 21:17:09.984417 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:09.984373 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2hr4w" podStartSLOduration=18.403137894 podStartE2EDuration="30.984358327s" podCreationTimestamp="2026-04-24 21:16:39 +0000 UTC" firstStartedPulling="2026-04-24 21:16:56.182696435 +0000 UTC m=+34.250302008" lastFinishedPulling="2026-04-24 21:17:08.763916867 +0000 UTC m=+46.831522441" observedRunningTime="2026-04-24 21:17:09.983774427 +0000 UTC m=+48.051380006" watchObservedRunningTime="2026-04-24 21:17:09.984358327 +0000 UTC m=+48.051963905"
Apr 24 21:17:10.035075 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.034068 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764588ff59-qsth8" podStartSLOduration=32.326261711 podStartE2EDuration="45.034048651s" podCreationTimestamp="2026-04-24 21:16:25 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.982653148 +0000 UTC m=+34.050258711" lastFinishedPulling="2026-04-24 21:17:08.690440094 +0000 UTC m=+46.758045651" observedRunningTime="2026-04-24 21:17:10.033697407 +0000 UTC m=+48.101302990" watchObservedRunningTime="2026-04-24 21:17:10.034048651 +0000 UTC m=+48.101654231"
Apr 24 21:17:10.052668 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.052378 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85c4b8b497-2tnb8" podStartSLOduration=32.497684081 podStartE2EDuration="45.052360585s" podCreationTimestamp="2026-04-24 21:16:25 +0000 UTC" firstStartedPulling="2026-04-24 21:16:56.210853928 +0000 UTC m=+34.278459499" lastFinishedPulling="2026-04-24 21:17:08.765530432 +0000 UTC m=+46.833136003" observedRunningTime="2026-04-24 21:17:10.050550097 +0000 UTC m=+48.118155673" watchObservedRunningTime="2026-04-24 21:17:10.052360585 +0000 UTC m=+48.119966162"
Apr 24 21:17:10.094506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.094436 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" podStartSLOduration=22.386274993 podStartE2EDuration="35.094412871s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.982606227 +0000 UTC m=+34.050211797" lastFinishedPulling="2026-04-24 21:17:08.690744111 +0000 UTC m=+46.758349675" observedRunningTime="2026-04-24 21:17:10.070007783 +0000 UTC m=+48.137613374" watchObservedRunningTime="2026-04-24 21:17:10.094412871 +0000 UTC m=+48.162018451"
Apr 24 21:17:10.094688 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.094561 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6l5v9" podStartSLOduration=22.385551074 podStartE2EDuration="35.094553267s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.981907645 +0000 UTC m=+34.049513214" lastFinishedPulling="2026-04-24 21:17:08.690909849 +0000 UTC m=+46.758515407" observedRunningTime="2026-04-24 21:17:10.092586476 +0000 UTC m=+48.160192067" watchObservedRunningTime="2026-04-24 21:17:10.094553267 +0000 UTC m=+48.162158845"
Apr 24 21:17:10.115343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.113878 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" podStartSLOduration=22.335852865 podStartE2EDuration="35.113854555s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.982589611 +0000 UTC m=+34.050195172" lastFinishedPulling="2026-04-24 21:17:08.76059129 +0000 UTC m=+46.828196862" observedRunningTime="2026-04-24 21:17:10.113480037 +0000 UTC m=+48.181085617" watchObservedRunningTime="2026-04-24 21:17:10.113854555 +0000 UTC m=+48.181460135"
Apr 24 21:17:10.887496 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.887384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:10.887692 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:10.887541 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:17:10.887692 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:10.887617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert podName:e58045bb-0010-4145-94ce-dd45fc5b114f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:12.887596714 +0000 UTC m=+50.955202281 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jq9c6" (UID: "e58045bb-0010-4145-94ce-dd45fc5b114f") : secret "networking-console-plugin-cert" not found
Apr 24 21:17:10.936944 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.936332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hdlfg_42a9166c-4a9e-42ec-8194-8b7b3852c9a2/dns-node-resolver/0.log"
Apr 24 21:17:10.948944 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.948831 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:17:10.949952 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.949265 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/0.log"
Apr 24 21:17:10.949952 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.949312 2578 generic.go:358] "Generic (PLEG): container finished" podID="158622c6-3d53-4f0c-bd64-00a4f1f32d69" containerID="bee80c754027a1b117984b1658c748d37b6d3ce6490d71255679572ec0844e2f" exitCode=255
Apr 24 21:17:10.949952 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.949439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" event={"ID":"158622c6-3d53-4f0c-bd64-00a4f1f32d69","Type":"ContainerDied","Data":"bee80c754027a1b117984b1658c748d37b6d3ce6490d71255679572ec0844e2f"}
Apr 24 21:17:10.949952 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.949476 2578 scope.go:117] "RemoveContainer" containerID="1228e7d692d3714af73aa02c5a52ecc1184505dbce36c419d133c7bb3837fe09"
Apr 24 21:17:10.949952 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:10.949683 2578 scope.go:117] "RemoveContainer" containerID="bee80c754027a1b117984b1658c748d37b6d3ce6490d71255679572ec0844e2f"
Apr 24 21:17:10.950209 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:10.949956 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wxcvf_openshift-console-operator(158622c6-3d53-4f0c-bd64-00a4f1f32d69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" podUID="158622c6-3d53-4f0c-bd64-00a4f1f32d69"
Apr 24 21:17:11.207109 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.207072 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"]
Apr 24 21:17:11.233231 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.233181 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"]
Apr 24 21:17:11.233399 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.233374 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"
Apr 24 21:17:11.235943 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.235872 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:11.236100 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.235872 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 21:17:11.236215 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.236115 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5rzk5\""
Apr 24 21:17:11.293008 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.292973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:17:11.293212 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.293034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdhws\" (UniqueName: \"kubernetes.io/projected/be760740-4d3a-496f-91a2-cef6e63a178b-kube-api-access-tdhws\") pod \"migrator-74bb7799d9-2sqrs\" (UID: \"be760740-4d3a-496f-91a2-cef6e63a178b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"
Apr 24 21:17:11.293212 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.293106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"
Apr 24 21:17:11.293212 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.293134 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:17:11.293212 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.293152 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6948fb494f-4hnkq: secret "image-registry-tls" not found
Apr 24 21:17:11.293212 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.293187 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:11.293457 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.293231 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls podName:fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.293212505 +0000 UTC m=+65.360818062 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bhp8b" (UID: "fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:11.293457 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.293306 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls podName:afad5a5d-bb00-427a-ad3b-6ad4a899dd16 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.293295801 +0000 UTC m=+65.360901359 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls") pod "image-registry-6948fb494f-4hnkq" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16") : secret "image-registry-tls" not found
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.393675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.393748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdhws\" (UniqueName: \"kubernetes.io/projected/be760740-4d3a-496f-91a2-cef6e63a178b-kube-api-access-tdhws\") pod \"migrator-74bb7799d9-2sqrs\" (UID: \"be760740-4d3a-496f-91a2-cef6e63a178b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.393854 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.393868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh"
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.393910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.393952 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls podName:7e42dfc6-9944-4e7e-a1d7-656b2871ff67 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.393908588 +0000 UTC m=+65.461514145 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lx5kc" (UID: "7e42dfc6-9944-4e7e-a1d7-656b2871ff67") : secret "samples-operator-tls" not found
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.393993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.394049 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.394017467 +0000 UTC m=+65.461623025 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : configmap references non-existent config key: service-ca.crt
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.394110 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.394155 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs podName:ca1e5038-3f4a-4f35-ac48-63b7f9fa576a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.394140662 +0000 UTC m=+65.461746218 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs") pod "router-default-5b7858ff9b-qlsj4" (UID: "ca1e5038-3f4a-4f35-ac48-63b7f9fa576a") : secret "router-metrics-certs-default" not found
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.394116 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:17:11.394368 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.394188 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert podName:7321b947-9a04-45eb-a042-a216d960cbb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.394180613 +0000 UTC m=+65.461786171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert") pod "ingress-canary-mxfwh" (UID: "7321b947-9a04-45eb-a042-a216d960cbb7") : secret "canary-serving-cert" not found
Apr 24 21:17:11.403114 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.403079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdhws\" (UniqueName: \"kubernetes.io/projected/be760740-4d3a-496f-91a2-cef6e63a178b-kube-api-access-tdhws\") pod \"migrator-74bb7799d9-2sqrs\" (UID: \"be760740-4d3a-496f-91a2-cef6e63a178b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"
Apr 24 21:17:11.495375 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.495283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd"
Apr 24 21:17:11.495530 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.495460 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:17:11.495580 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.495539 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls podName:dd6441e0-b9f0-483a-95f8-56bff0d86e71 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.495517497 +0000 UTC m=+65.563123055 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls") pod "dns-default-v4ffd" (UID: "dd6441e0-b9f0-483a-95f8-56bff0d86e71") : secret "dns-default-metrics-tls" not found
Apr 24 21:17:11.546491 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.546451 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"
Apr 24 21:17:11.621524 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.621488 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6n8rw"]
Apr 24 21:17:11.638407 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.638369 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6n8rw"]
Apr 24 21:17:11.638584 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.638532 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.641131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.640856 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:17:11.641131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.640872 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-26gnc\""
Apr 24 21:17:11.641131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.641097 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:17:11.697490 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.697450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-data-volume\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.697680 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.697499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.697680 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.697632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-crio-socket\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.697810 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.697681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5h7j\" (UniqueName: \"kubernetes.io/projected/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-kube-api-access-r5h7j\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.697862 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.697808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.798558 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.798468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5h7j\" (UniqueName: \"kubernetes.io/projected/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-kube-api-access-r5h7j\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.798733 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.798559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.798801 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.798776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-data-volume\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.798858 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.798806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.798907 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.798894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-crio-socket\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:11.799304 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.799107 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 21:17:11.799304 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.799187 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls podName:04c1b2a9-55bb-42b1-8157-4885f8b97dc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:12.299165842 +0000 UTC m=+50.366771414 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6n8rw" (UID: "04c1b2a9-55bb-42b1-8157-4885f8b97dc0") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:11.799304 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.799195 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-crio-socket\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:11.799304 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.799190 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-data-volume\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:11.799304 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.799211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:11.808370 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.808313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5h7j\" (UniqueName: \"kubernetes.io/projected/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-kube-api-access-r5h7j\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 
24 21:17:11.936972 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.936940 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7p6s9_5f9a3a11-4bde-44e9-a68d-2d9ababf72d3/node-ca/0.log" Apr 24 21:17:11.959819 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.959784 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log" Apr 24 21:17:11.960273 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:11.960243 2578 scope.go:117] "RemoveContainer" containerID="bee80c754027a1b117984b1658c748d37b6d3ce6490d71255679572ec0844e2f" Apr 24 21:17:11.960475 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:11.960451 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wxcvf_openshift-console-operator(158622c6-3d53-4f0c-bd64-00a4f1f32d69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" podUID="158622c6-3d53-4f0c-bd64-00a4f1f32d69" Apr 24 21:17:12.303534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:12.303491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:12.303728 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:12.303646 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:12.303788 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:12.303731 2578 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls podName:04c1b2a9-55bb-42b1-8157-4885f8b97dc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:13.303714938 +0000 UTC m=+51.371320507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6n8rw" (UID: "04c1b2a9-55bb-42b1-8157-4885f8b97dc0") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:12.908165 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:12.908114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6" Apr 24 21:17:12.908372 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:12.908267 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:17:12.908372 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:12.908359 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert podName:e58045bb-0010-4145-94ce-dd45fc5b114f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:16.908341603 +0000 UTC m=+54.975947162 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jq9c6" (UID: "e58045bb-0010-4145-94ce-dd45fc5b114f") : secret "networking-console-plugin-cert" not found Apr 24 21:17:13.311279 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.311233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:13.311747 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:13.311383 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:13.311747 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:13.311466 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls podName:04c1b2a9-55bb-42b1-8157-4885f8b97dc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:15.311446804 +0000 UTC m=+53.379052370 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6n8rw" (UID: "04c1b2a9-55bb-42b1-8157-4885f8b97dc0") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:13.741162 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.740289 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs"] Apr 24 21:17:13.747869 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:13.747841 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe760740_4d3a_496f_91a2_cef6e63a178b.slice/crio-d2b0b098ca3cfbfad97008276e4b06068f2a7bdd785d96ba7032a772362c349f WatchSource:0}: Error finding container d2b0b098ca3cfbfad97008276e4b06068f2a7bdd785d96ba7032a772362c349f: Status 404 returned error can't find the container with id d2b0b098ca3cfbfad97008276e4b06068f2a7bdd785d96ba7032a772362c349f Apr 24 21:17:13.794581 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.794549 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lk5xx"] Apr 24 21:17:13.797981 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.797957 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:13.800832 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.800797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:17:13.801177 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.801152 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:17:13.801438 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.801421 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:17:13.801714 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.801697 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:17:13.801852 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.801827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-885p5\"" Apr 24 21:17:13.805297 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.805266 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lk5xx"] Apr 24 21:17:13.916466 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.916429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlbl\" (UniqueName: \"kubernetes.io/projected/f4b7c74f-9f58-46b4-8294-22f10d98b170-kube-api-access-hvlbl\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:13.916691 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.916486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/f4b7c74f-9f58-46b4-8294-22f10d98b170-signing-key\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:13.916691 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.916602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4b7c74f-9f58-46b4-8294-22f10d98b170-signing-cabundle\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:13.968545 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.968505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" event={"ID":"5e304870-95cc-4401-8456-8388b7b9d759","Type":"ContainerStarted","Data":"395a5e38728c7becbdd76381e070026a5dec3bd0f27036d871ddb0416aaad1dc"} Apr 24 21:17:13.968545 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.968545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" event={"ID":"5e304870-95cc-4401-8456-8388b7b9d759","Type":"ContainerStarted","Data":"d51281572220b28cf036e58c86076d7c35823830c69511db080765a621871658"} Apr 24 21:17:13.969567 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.969542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs" event={"ID":"be760740-4d3a-496f-91a2-cef6e63a178b","Type":"ContainerStarted","Data":"d2b0b098ca3cfbfad97008276e4b06068f2a7bdd785d96ba7032a772362c349f"} Apr 24 21:17:13.970706 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:13.970688 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c7xmt" 
event={"ID":"e57ddd4a-8801-45ab-b463-3398cdeea471","Type":"ContainerStarted","Data":"391704da341ded60b9526263b90b16862a79a74a8471466f5110fbd4b0d9a39e"} Apr 24 21:17:14.018222 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.018179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlbl\" (UniqueName: \"kubernetes.io/projected/f4b7c74f-9f58-46b4-8294-22f10d98b170-kube-api-access-hvlbl\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:14.018418 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.018247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4b7c74f-9f58-46b4-8294-22f10d98b170-signing-key\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:14.018418 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.018321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4b7c74f-9f58-46b4-8294-22f10d98b170-signing-cabundle\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:14.018998 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.018952 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-c7xmt" podStartSLOduration=24.475193315 podStartE2EDuration="32.01891928s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:17:06.04401105 +0000 UTC m=+44.111616612" lastFinishedPulling="2026-04-24 21:17:13.587737003 +0000 UTC m=+51.655342577" observedRunningTime="2026-04-24 21:17:14.01806319 +0000 UTC m=+52.085668772" 
watchObservedRunningTime="2026-04-24 21:17:14.01891928 +0000 UTC m=+52.086524859" Apr 24 21:17:14.019130 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.019020 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" podStartSLOduration=31.704487123 podStartE2EDuration="49.01901644s" podCreationTimestamp="2026-04-24 21:16:25 +0000 UTC" firstStartedPulling="2026-04-24 21:16:56.279509724 +0000 UTC m=+34.347115296" lastFinishedPulling="2026-04-24 21:17:13.594039054 +0000 UTC m=+51.661644613" observedRunningTime="2026-04-24 21:17:13.996954839 +0000 UTC m=+52.064560423" watchObservedRunningTime="2026-04-24 21:17:14.01901644 +0000 UTC m=+52.086622020" Apr 24 21:17:14.019222 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.019193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4b7c74f-9f58-46b4-8294-22f10d98b170-signing-cabundle\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:14.021086 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.021065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4b7c74f-9f58-46b4-8294-22f10d98b170-signing-key\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:14.027226 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.027176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlbl\" (UniqueName: \"kubernetes.io/projected/f4b7c74f-9f58-46b4-8294-22f10d98b170-kube-api-access-hvlbl\") pod \"service-ca-865cb79987-lk5xx\" (UID: \"f4b7c74f-9f58-46b4-8294-22f10d98b170\") " pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 
24 21:17:14.111553 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.111507 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-lk5xx" Apr 24 21:17:14.258684 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.258653 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lk5xx"] Apr 24 21:17:14.261446 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:14.261408 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b7c74f_9f58_46b4_8294_22f10d98b170.slice/crio-e03cf35d216f49636848a3021808c2f040d546a0f4446a5e3c38f809b1a8fce2 WatchSource:0}: Error finding container e03cf35d216f49636848a3021808c2f040d546a0f4446a5e3c38f809b1a8fce2: Status 404 returned error can't find the container with id e03cf35d216f49636848a3021808c2f040d546a0f4446a5e3c38f809b1a8fce2 Apr 24 21:17:14.974957 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.974898 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-lk5xx" event={"ID":"f4b7c74f-9f58-46b4-8294-22f10d98b170","Type":"ContainerStarted","Data":"98451af02588f7bb2e74a52c63ba20bdb3c4aaed8abd6f41227543a3d2024a67"} Apr 24 21:17:14.974957 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.974955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-lk5xx" event={"ID":"f4b7c74f-9f58-46b4-8294-22f10d98b170","Type":"ContainerStarted","Data":"e03cf35d216f49636848a3021808c2f040d546a0f4446a5e3c38f809b1a8fce2"} Apr 24 21:17:14.976771 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.976743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs" 
event={"ID":"be760740-4d3a-496f-91a2-cef6e63a178b","Type":"ContainerStarted","Data":"77e507fb0b10d7c97457a6e42f09b1731aedfad52f9b6259f45349b6f117462d"} Apr 24 21:17:14.976771 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:14.976773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs" event={"ID":"be760740-4d3a-496f-91a2-cef6e63a178b","Type":"ContainerStarted","Data":"bfccbf5a8ab72ee26bdae05b09c798e0c82cc6e27583fc2854b32611f87a09fc"} Apr 24 21:17:15.003520 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:15.003464 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-lk5xx" podStartSLOduration=2.003448766 podStartE2EDuration="2.003448766s" podCreationTimestamp="2026-04-24 21:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:15.001982872 +0000 UTC m=+53.069588448" watchObservedRunningTime="2026-04-24 21:17:15.003448766 +0000 UTC m=+53.071054344" Apr 24 21:17:15.021947 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:15.021819 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2sqrs" podStartSLOduration=3.015905583 podStartE2EDuration="4.021799166s" podCreationTimestamp="2026-04-24 21:17:11 +0000 UTC" firstStartedPulling="2026-04-24 21:17:13.750258761 +0000 UTC m=+51.817864317" lastFinishedPulling="2026-04-24 21:17:14.756152326 +0000 UTC m=+52.823757900" observedRunningTime="2026-04-24 21:17:15.021170705 +0000 UTC m=+53.088776285" watchObservedRunningTime="2026-04-24 21:17:15.021799166 +0000 UTC m=+53.089404746" Apr 24 21:17:15.332954 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:15.332836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:15.333124 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:15.333015 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:15.333124 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:15.333106 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls podName:04c1b2a9-55bb-42b1-8157-4885f8b97dc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:19.333090625 +0000 UTC m=+57.400696183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6n8rw" (UID: "04c1b2a9-55bb-42b1-8157-4885f8b97dc0") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:15.648483 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:15.648386 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:17:15.648483 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:15.648426 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" Apr 24 21:17:15.648937 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:15.648905 2578 scope.go:117] "RemoveContainer" containerID="bee80c754027a1b117984b1658c748d37b6d3ce6490d71255679572ec0844e2f" Apr 24 21:17:15.649191 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:15.649169 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wxcvf_openshift-console-operator(158622c6-3d53-4f0c-bd64-00a4f1f32d69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" podUID="158622c6-3d53-4f0c-bd64-00a4f1f32d69" Apr 24 21:17:16.950515 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:16.950469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6" Apr 24 21:17:16.950972 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:16.950627 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:17:16.950972 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:16.950697 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert podName:e58045bb-0010-4145-94ce-dd45fc5b114f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:24.950679955 +0000 UTC m=+63.018285516 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jq9c6" (UID: "e58045bb-0010-4145-94ce-dd45fc5b114f") : secret "networking-console-plugin-cert" not found Apr 24 21:17:19.375131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:19.375085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw" Apr 24 21:17:19.375571 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:19.375241 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:19.375571 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:19.375313 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls podName:04c1b2a9-55bb-42b1-8157-4885f8b97dc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:27.375297241 +0000 UTC m=+65.442902798 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6n8rw" (UID: "04c1b2a9-55bb-42b1-8157-4885f8b97dc0") : secret "insights-runtime-extractor-tls" not found
Apr 24 21:17:20.693394 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:20.693361 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqvlx"
Apr 24 21:17:25.026155 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:25.026111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:25.028871 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:25.028844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e58045bb-0010-4145-94ce-dd45fc5b114f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jq9c6\" (UID: \"e58045bb-0010-4145-94ce-dd45fc5b114f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:25.102481 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:25.102448 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6z8tz\""
Apr 24 21:17:25.110527 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:25.110495 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"
Apr 24 21:17:25.239395 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:25.239365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6"]
Apr 24 21:17:25.255555 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:25.255514 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58045bb_0010_4145_94ce_dd45fc5b114f.slice/crio-14df0ab5f0db09629229c0c575d50c850b74d2d2341a139af05ea5269e8e99bc WatchSource:0}: Error finding container 14df0ab5f0db09629229c0c575d50c850b74d2d2341a139af05ea5269e8e99bc: Status 404 returned error can't find the container with id 14df0ab5f0db09629229c0c575d50c850b74d2d2341a139af05ea5269e8e99bc
Apr 24 21:17:26.010054 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:26.010007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6" event={"ID":"e58045bb-0010-4145-94ce-dd45fc5b114f","Type":"ContainerStarted","Data":"14df0ab5f0db09629229c0c575d50c850b74d2d2341a139af05ea5269e8e99bc"}
Apr 24 21:17:27.013947 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.013890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6" event={"ID":"e58045bb-0010-4145-94ce-dd45fc5b114f","Type":"ContainerStarted","Data":"4b368bbfecc28d335eab651df0eb604c76db256e05b50be06d536286333d07a2"}
Apr 24 21:17:27.038011 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.037954 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jq9c6" podStartSLOduration=16.728689576 podStartE2EDuration="18.037914147s" podCreationTimestamp="2026-04-24 21:17:09 +0000 UTC" firstStartedPulling="2026-04-24 21:17:25.257531911 +0000 UTC m=+63.325137470" lastFinishedPulling="2026-04-24 21:17:26.566756471 +0000 UTC m=+64.634362041" observedRunningTime="2026-04-24 21:17:27.037267407 +0000 UTC m=+65.104872986" watchObservedRunningTime="2026-04-24 21:17:27.037914147 +0000 UTC m=+65.105519727"
Apr 24 21:17:27.348733 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.348641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"
Apr 24 21:17:27.348894 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.348785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:17:27.351343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.351314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"image-registry-6948fb494f-4hnkq\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:17:27.351343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.351338 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bhp8b\" (UID: \"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"
Apr 24 21:17:27.449995 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.449952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"
Apr 24 21:17:27.450167 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.450035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:27.450167 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.450066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh"
Apr 24 21:17:27.450167 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.450090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:27.450167 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.450107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:27.450796 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.450767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-service-ca-bundle\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:27.452461 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.452439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e42dfc6-9944-4e7e-a1d7-656b2871ff67-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lx5kc\" (UID: \"7e42dfc6-9944-4e7e-a1d7-656b2871ff67\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"
Apr 24 21:17:27.452461 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.452457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7321b947-9a04-45eb-a042-a216d960cbb7-cert\") pod \"ingress-canary-mxfwh\" (UID: \"7321b947-9a04-45eb-a042-a216d960cbb7\") " pod="openshift-ingress-canary/ingress-canary-mxfwh"
Apr 24 21:17:27.452808 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.452786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04c1b2a9-55bb-42b1-8157-4885f8b97dc0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6n8rw\" (UID: \"04c1b2a9-55bb-42b1-8157-4885f8b97dc0\") " pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:27.452863 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.452849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca1e5038-3f4a-4f35-ac48-63b7f9fa576a-metrics-certs\") pod \"router-default-5b7858ff9b-qlsj4\" (UID: \"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a\") " pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:27.463801 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.463774 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dqkgc\""
Apr 24 21:17:27.471186 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.471155 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:17:27.521171 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.521140 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wz2s5\""
Apr 24 21:17:27.529592 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.529255 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"
Apr 24 21:17:27.550812 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.550720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd"
Apr 24 21:17:27.553360 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.553329 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-26gnc\""
Apr 24 21:17:27.553616 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.553591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd6441e0-b9f0-483a-95f8-56bff0d86e71-metrics-tls\") pod \"dns-default-v4ffd\" (UID: \"dd6441e0-b9f0-483a-95f8-56bff0d86e71\") " pod="openshift-dns/dns-default-v4ffd"
Apr 24 21:17:27.561737 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.561687 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6n8rw"
Apr 24 21:17:27.591561 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.591354 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2cbt7\""
Apr 24 21:17:27.598326 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.597994 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"
Apr 24 21:17:27.607222 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.606951 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6948fb494f-4hnkq"]
Apr 24 21:17:27.607468 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.607384 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sbnww\""
Apr 24 21:17:27.609904 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:27.609857 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafad5a5d_bb00_427a_ad3b_6ad4a899dd16.slice/crio-df552912e815049c67525343bab51e45261a42ab4908261d5d6a635cbc6cbe5b WatchSource:0}: Error finding container df552912e815049c67525343bab51e45261a42ab4908261d5d6a635cbc6cbe5b: Status 404 returned error can't find the container with id df552912e815049c67525343bab51e45261a42ab4908261d5d6a635cbc6cbe5b
Apr 24 21:17:27.615871 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.615781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:27.636702 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.636669 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6g56g\""
Apr 24 21:17:27.644615 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.644557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxfwh"
Apr 24 21:17:27.706030 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.704325 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9vgbs\""
Apr 24 21:17:27.712265 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.712170 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v4ffd"
Apr 24 21:17:27.725695 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.725324 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b"]
Apr 24 21:17:27.739390 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:27.739198 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe5f6e68_6c62_42ed_8fb3_c8b29635d5bb.slice/crio-23ccdbdcc828a07c2d4396bcedc033f12dde9538920ca3b5dd1c5fded3889256 WatchSource:0}: Error finding container 23ccdbdcc828a07c2d4396bcedc033f12dde9538920ca3b5dd1c5fded3889256: Status 404 returned error can't find the container with id 23ccdbdcc828a07c2d4396bcedc033f12dde9538920ca3b5dd1c5fded3889256
Apr 24 21:17:27.754788 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.751240 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6n8rw"]
Apr 24 21:17:27.787400 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.786737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc"]
Apr 24 21:17:27.822108 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.822065 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b7858ff9b-qlsj4"]
Apr 24 21:17:27.826992 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:27.826944 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1e5038_3f4a_4f35_ac48_63b7f9fa576a.slice/crio-f41c356a2547ddde6e7f60986eaf30135cd9cf9eb15c4fd37f3ba3f8d077cae3 WatchSource:0}: Error finding container f41c356a2547ddde6e7f60986eaf30135cd9cf9eb15c4fd37f3ba3f8d077cae3: Status 404 returned error can't find the container with id f41c356a2547ddde6e7f60986eaf30135cd9cf9eb15c4fd37f3ba3f8d077cae3
Apr 24 21:17:27.851240 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.850884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mxfwh"]
Apr 24 21:17:27.854233 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:27.854174 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7321b947_9a04_45eb_a042_a216d960cbb7.slice/crio-09d374e3deb570001ee2dff56b103f90c814f3c013487ab486b8cf432e1a5159 WatchSource:0}: Error finding container 09d374e3deb570001ee2dff56b103f90c814f3c013487ab486b8cf432e1a5159: Status 404 returned error can't find the container with id 09d374e3deb570001ee2dff56b103f90c814f3c013487ab486b8cf432e1a5159
Apr 24 21:17:27.907904 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:27.907854 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v4ffd"]
Apr 24 21:17:27.911824 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:27.911791 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6441e0_b9f0_483a_95f8_56bff0d86e71.slice/crio-39e0c6bd798d54b9738157e85d170938dd7ef29c6c6b0eb70a3e052df455e4d3 WatchSource:0}: Error finding container 39e0c6bd798d54b9738157e85d170938dd7ef29c6c6b0eb70a3e052df455e4d3: Status 404 returned error can't find the container with id 39e0c6bd798d54b9738157e85d170938dd7ef29c6c6b0eb70a3e052df455e4d3
Apr 24 21:17:28.018396 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.018355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" event={"ID":"afad5a5d-bb00-427a-ad3b-6ad4a899dd16","Type":"ContainerStarted","Data":"b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc"}
Apr 24 21:17:28.018869 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.018403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" event={"ID":"afad5a5d-bb00-427a-ad3b-6ad4a899dd16","Type":"ContainerStarted","Data":"df552912e815049c67525343bab51e45261a42ab4908261d5d6a635cbc6cbe5b"}
Apr 24 21:17:28.018869 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.018560 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:17:28.019936 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.019899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" event={"ID":"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a","Type":"ContainerStarted","Data":"d50099e3d17e779a50e5fb704ad016ec92f30158c3602d00005397286f986804"}
Apr 24 21:17:28.020063 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.019967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" event={"ID":"ca1e5038-3f4a-4f35-ac48-63b7f9fa576a","Type":"ContainerStarted","Data":"f41c356a2547ddde6e7f60986eaf30135cd9cf9eb15c4fd37f3ba3f8d077cae3"}
Apr 24 21:17:28.021320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.021296 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6n8rw" event={"ID":"04c1b2a9-55bb-42b1-8157-4885f8b97dc0","Type":"ContainerStarted","Data":"53ca5fbcef4b0cb6efad8ae40bf99811836fb5e1cc3c9f3a9d6c0c1ebc35404f"}
Apr 24 21:17:28.021421 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.021327 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6n8rw" event={"ID":"04c1b2a9-55bb-42b1-8157-4885f8b97dc0","Type":"ContainerStarted","Data":"7a2219cf0c76bab705c1b0e91e598d6c88753e72cece72f05ac4ebd3ce7700aa"}
Apr 24 21:17:28.022434 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.022409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v4ffd" event={"ID":"dd6441e0-b9f0-483a-95f8-56bff0d86e71","Type":"ContainerStarted","Data":"39e0c6bd798d54b9738157e85d170938dd7ef29c6c6b0eb70a3e052df455e4d3"}
Apr 24 21:17:28.023533 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.023512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" event={"ID":"7e42dfc6-9944-4e7e-a1d7-656b2871ff67","Type":"ContainerStarted","Data":"a2902fa3b52a0f99378d8651e845345fb8a6aef88dcad9753756c3d04f5cac84"}
Apr 24 21:17:28.024555 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.024532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" event={"ID":"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb","Type":"ContainerStarted","Data":"23ccdbdcc828a07c2d4396bcedc033f12dde9538920ca3b5dd1c5fded3889256"}
Apr 24 21:17:28.025656 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.025630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mxfwh" event={"ID":"7321b947-9a04-45eb-a042-a216d960cbb7","Type":"ContainerStarted","Data":"09d374e3deb570001ee2dff56b103f90c814f3c013487ab486b8cf432e1a5159"}
Apr 24 21:17:28.122008 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.121886 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" podStartSLOduration=65.121866722 podStartE2EDuration="1m5.121866722s" podCreationTimestamp="2026-04-24 21:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:28.057843798 +0000 UTC m=+66.125449390" watchObservedRunningTime="2026-04-24 21:17:28.121866722 +0000 UTC m=+66.189472301"
Apr 24 21:17:28.122480 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.122456 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4" podStartSLOduration=52.122447014 podStartE2EDuration="52.122447014s" podCreationTimestamp="2026-04-24 21:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:28.120665127 +0000 UTC m=+66.188270717" watchObservedRunningTime="2026-04-24 21:17:28.122447014 +0000 UTC m=+66.190052593"
Apr 24 21:17:28.260432 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.260380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:17:28.263062 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.263028 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:17:28.274309 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.274237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa91d92-6b3a-44d1-8b22-abae9ded2a1c-metrics-certs\") pod \"network-metrics-daemon-4xbxm\" (UID: \"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c\") " pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:17:28.413374 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.413084 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gjqgx\""
Apr 24 21:17:28.421639 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.421139 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xbxm"
Apr 24 21:17:28.606649 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.606612 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4xbxm"]
Apr 24 21:17:28.616432 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.616362 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:28.619615 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:28.619581 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:28.797717 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:28.797632 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa91d92_6b3a_44d1_8b22_abae9ded2a1c.slice/crio-2aab314a68244deeb2479e45a396de365e557f66a3c73151e9bb3911d2c53774 WatchSource:0}: Error finding container 2aab314a68244deeb2479e45a396de365e557f66a3c73151e9bb3911d2c53774: Status 404 returned error can't find the container with id 2aab314a68244deeb2479e45a396de365e557f66a3c73151e9bb3911d2c53774
Apr 24 21:17:29.033597 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:29.033531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xbxm" event={"ID":"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c","Type":"ContainerStarted","Data":"2aab314a68244deeb2479e45a396de365e557f66a3c73151e9bb3911d2c53774"}
Apr 24 21:17:29.034206 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:29.034062 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:29.035813 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:29.035559 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5b7858ff9b-qlsj4"
Apr 24 21:17:29.524301 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:29.524272 2578 scope.go:117] "RemoveContainer" containerID="bee80c754027a1b117984b1658c748d37b6d3ce6490d71255679572ec0844e2f"
Apr 24 21:17:33.054131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.053187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v4ffd" event={"ID":"dd6441e0-b9f0-483a-95f8-56bff0d86e71","Type":"ContainerStarted","Data":"83235a19b100ce5d9f5aae894e4e310285807170ceb2c03a0463ba5b44cb0425"}
Apr 24 21:17:33.054131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.053242 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v4ffd" event={"ID":"dd6441e0-b9f0-483a-95f8-56bff0d86e71","Type":"ContainerStarted","Data":"cbf7b67ae8495e5e751e72881e1411cd6e5592bce73f6183914c55197bac4aff"}
Apr 24 21:17:33.054131 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.054087 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v4ffd"
Apr 24 21:17:33.057967 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.057348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" event={"ID":"7e42dfc6-9944-4e7e-a1d7-656b2871ff67","Type":"ContainerStarted","Data":"215717c7848f88a206a5b11d75207311f6abfc31f843159f83eecbac71090ae2"}
Apr 24 21:17:33.057967 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.057385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" event={"ID":"7e42dfc6-9944-4e7e-a1d7-656b2871ff67","Type":"ContainerStarted","Data":"23c315d830a2087285a16f972cdb9d103ef86d21067576c22354f224ff4638b9"}
Apr 24 21:17:33.061961 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.061047 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:17:33.061961 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.061138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" event={"ID":"158622c6-3d53-4f0c-bd64-00a4f1f32d69","Type":"ContainerStarted","Data":"ff4d3aee0dace74e4de71452609670286aad7c8fc23c32a4821a34476f663572"}
Apr 24 21:17:33.062275 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.062204 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf"
Apr 24 21:17:33.065454 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.065420 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" event={"ID":"fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb","Type":"ContainerStarted","Data":"b3a5381cb1b086ac01bd2edf7a123426ec75ee83278e4806342b9c560ae4b367"}
Apr 24 21:17:33.069339 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.069314 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf"
Apr 24 21:17:33.071299 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.070671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mxfwh" event={"ID":"7321b947-9a04-45eb-a042-a216d960cbb7","Type":"ContainerStarted","Data":"bad2d109aa489ee8340cfe2dcf9a363e9440e30d2357e2524fbb65a657534709"}
Apr 24 21:17:33.074342 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.074282 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6n8rw" event={"ID":"04c1b2a9-55bb-42b1-8157-4885f8b97dc0","Type":"ContainerStarted","Data":"e093a09d773e6ed5871b356676443c5a7979a9e500fee7cdb3b661add51801d2"}
Apr 24 21:17:33.078627 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.078578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xbxm" event={"ID":"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c","Type":"ContainerStarted","Data":"293712498a93b9f2b2c63cc45b9b6308c536ddffc5cc352b7a4338713129e788"}
Apr 24 21:17:33.097965 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.097723 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v4ffd" podStartSLOduration=33.751510523 podStartE2EDuration="38.097701238s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.913753649 +0000 UTC m=+65.981359207" lastFinishedPulling="2026-04-24 21:17:32.259944355 +0000 UTC m=+70.327549922" observedRunningTime="2026-04-24 21:17:33.096467332 +0000 UTC m=+71.164072908" watchObservedRunningTime="2026-04-24 21:17:33.097701238 +0000 UTC m=+71.165306818"
Apr 24 21:17:33.136681 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.135770 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bhp8b" podStartSLOduration=53.617788466 podStartE2EDuration="58.135749083s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.74194662 +0000 UTC m=+65.809552178" lastFinishedPulling="2026-04-24 21:17:32.259907236 +0000 UTC m=+70.327512795" observedRunningTime="2026-04-24 21:17:33.135408779 +0000 UTC m=+71.203014439" watchObservedRunningTime="2026-04-24 21:17:33.135749083 +0000 UTC m=+71.203354663"
Apr 24 21:17:33.153081 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.152988 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mxfwh" podStartSLOduration=33.75122401 podStartE2EDuration="38.152969835s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.85819351 +0000 UTC m=+65.925799068" lastFinishedPulling="2026-04-24 21:17:32.259939332 +0000 UTC m=+70.327544893" observedRunningTime="2026-04-24 21:17:33.15109661 +0000 UTC m=+71.218702201" watchObservedRunningTime="2026-04-24 21:17:33.152969835 +0000 UTC m=+71.220575413"
Apr 24 21:17:33.169583 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.169531 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lx5kc" podStartSLOduration=53.783794369 podStartE2EDuration="58.169513021s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.874222586 +0000 UTC m=+65.941828142" lastFinishedPulling="2026-04-24 21:17:32.259941222 +0000 UTC m=+70.327546794" observedRunningTime="2026-04-24 21:17:33.167965786 +0000 UTC m=+71.235571361" watchObservedRunningTime="2026-04-24 21:17:33.169513021 +0000 UTC m=+71.237118600"
Apr 24 21:17:33.188009 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:33.187891 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-wxcvf" podStartSLOduration=45.479320493 podStartE2EDuration="58.187866114s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.982470196 +0000 UTC m=+34.050075768" lastFinishedPulling="2026-04-24 21:17:08.691015832 +0000 UTC m=+46.758621389" observedRunningTime="2026-04-24 21:17:33.186185563 +0000 UTC m=+71.253791142" watchObservedRunningTime="2026-04-24 21:17:33.187866114 +0000 UTC m=+71.255471694"
Apr 24 21:17:34.088384 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:34.088341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xbxm" event={"ID":"bfa91d92-6b3a-44d1-8b22-abae9ded2a1c","Type":"ContainerStarted","Data":"0172d659a7f06197c0bb615b5215daaf6009c8cf6c2cd062a4c2d468cbce3c2b"}
Apr 24 21:17:34.110217 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:34.110163 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4xbxm" podStartSLOduration=68.206692941 podStartE2EDuration="1m12.110146673s" podCreationTimestamp="2026-04-24 21:16:22 +0000 UTC" firstStartedPulling="2026-04-24 21:17:28.80136842 +0000 UTC m=+66.868973982" lastFinishedPulling="2026-04-24 21:17:32.704822152 +0000 UTC m=+70.772427714" observedRunningTime="2026-04-24 21:17:34.10851405 +0000 UTC m=+72.176119624" watchObservedRunningTime="2026-04-24 21:17:34.110146673 +0000 UTC m=+72.177752317"
Apr 24 21:17:35.093380 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:35.093331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6n8rw" event={"ID":"04c1b2a9-55bb-42b1-8157-4885f8b97dc0","Type":"ContainerStarted","Data":"87f68b45fb16977f2aaee46d2b502610b2fd48ecad4f414d31eb5c1eb142e0ca"}
Apr 24 21:17:35.113265 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:35.113205 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6n8rw" podStartSLOduration=17.598364027 podStartE2EDuration="24.1131898s" podCreationTimestamp="2026-04-24 21:17:11 +0000 UTC" firstStartedPulling="2026-04-24 21:17:27.877591005 +0000 UTC m=+65.945196562" lastFinishedPulling="2026-04-24 21:17:34.392416774 +0000 UTC m=+72.460022335" observedRunningTime="2026-04-24 21:17:35.111758399 +0000 UTC m=+73.179363979" watchObservedRunningTime="2026-04-24 21:17:35.1131898 +0000 UTC m=+73.180795379"
Apr 24 21:17:41.963152 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:41.963118 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pnbsk"
Apr 24 21:17:44.095441 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:44.095405 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v4ffd"
Apr 24 21:17:46.879869 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.879829 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-d8gpx"]
Apr 24 21:17:46.894285 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.894249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d8gpx"
Apr 24 21:17:46.897640 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.897608 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:17:46.898468 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.898443 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:17:46.899113 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.898661 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:17:46.899113 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.898697 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:17:46.899113 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.899042 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fm79n\""
Apr 24 21:17:46.925614 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-root\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx"
Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925623 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-sys\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx"
Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qs9\" (UniqueName: \"kubernetes.io/projected/040f6ff5-58c2-4a5b-be11-5ba8475596fd-kube-api-access-d8qs9\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx"
Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-wtmp\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx"
Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-tls\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx"
Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925801 2578
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-textfile\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:46.925964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-accelerators-collector-config\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:46.926366 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:46.925894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/040f6ff5-58c2-4a5b-be11-5ba8475596fd-metrics-client-ca\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026428 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026621 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026439 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-textfile\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026683 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-accelerators-collector-config\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/040f6ff5-58c2-4a5b-be11-5ba8475596fd-metrics-client-ca\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-root\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026840 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026742 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-sys\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026840 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qs9\" (UniqueName: \"kubernetes.io/projected/040f6ff5-58c2-4a5b-be11-5ba8475596fd-kube-api-access-d8qs9\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026840 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-textfile\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.026840 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-wtmp\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.027079 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.026890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-tls\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.027079 ip-10-0-132-219 kubenswrapper[2578]: I0424 
21:17:47.026920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-wtmp\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.027079 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:47.027026 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:17:47.027237 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:17:47.027101 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-tls podName:040f6ff5-58c2-4a5b-be11-5ba8475596fd nodeName:}" failed. No retries permitted until 2026-04-24 21:17:47.527080782 +0000 UTC m=+85.594686341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-tls") pod "node-exporter-d8gpx" (UID: "040f6ff5-58c2-4a5b-be11-5ba8475596fd") : secret "node-exporter-tls" not found Apr 24 21:17:47.027332 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.027287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-accelerators-collector-config\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.027421 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.027383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-sys\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " 
pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.027421 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.027388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/040f6ff5-58c2-4a5b-be11-5ba8475596fd-root\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.027531 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.027432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/040f6ff5-58c2-4a5b-be11-5ba8475596fd-metrics-client-ca\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.028964 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.028939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.036221 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.036194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qs9\" (UniqueName: \"kubernetes.io/projected/040f6ff5-58c2-4a5b-be11-5ba8475596fd-kube-api-access-d8qs9\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.531697 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.531654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-tls\") pod \"node-exporter-d8gpx\" (UID: 
\"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.534118 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.534081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/040f6ff5-58c2-4a5b-be11-5ba8475596fd-node-exporter-tls\") pod \"node-exporter-d8gpx\" (UID: \"040f6ff5-58c2-4a5b-be11-5ba8475596fd\") " pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.807022 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:47.806902 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d8gpx" Apr 24 21:17:47.817951 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:17:47.817890 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040f6ff5_58c2_4a5b_be11_5ba8475596fd.slice/crio-7426d2d70eb2bc68dccac13b3286ddee96b7fc1a7ddeee6e5fc9520b30eb3e1d WatchSource:0}: Error finding container 7426d2d70eb2bc68dccac13b3286ddee96b7fc1a7ddeee6e5fc9520b30eb3e1d: Status 404 returned error can't find the container with id 7426d2d70eb2bc68dccac13b3286ddee96b7fc1a7ddeee6e5fc9520b30eb3e1d Apr 24 21:17:48.135823 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:48.135722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d8gpx" event={"ID":"040f6ff5-58c2-4a5b-be11-5ba8475596fd","Type":"ContainerStarted","Data":"7426d2d70eb2bc68dccac13b3286ddee96b7fc1a7ddeee6e5fc9520b30eb3e1d"} Apr 24 21:17:49.038019 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:49.037989 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:17:49.140432 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:49.140394 2578 generic.go:358] "Generic (PLEG): container finished" podID="040f6ff5-58c2-4a5b-be11-5ba8475596fd" 
containerID="e9cd41b166660fed7f28c7dcf4394e6e975ee71cb475ba17b4739cb62c756f81" exitCode=0 Apr 24 21:17:49.140907 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:49.140444 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d8gpx" event={"ID":"040f6ff5-58c2-4a5b-be11-5ba8475596fd","Type":"ContainerDied","Data":"e9cd41b166660fed7f28c7dcf4394e6e975ee71cb475ba17b4739cb62c756f81"} Apr 24 21:17:50.145709 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:50.145666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d8gpx" event={"ID":"040f6ff5-58c2-4a5b-be11-5ba8475596fd","Type":"ContainerStarted","Data":"6f89afdb082aecf6500970304bc49e328b6d4b3624ef538c84763955a7384a8c"} Apr 24 21:17:50.146118 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:50.145716 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d8gpx" event={"ID":"040f6ff5-58c2-4a5b-be11-5ba8475596fd","Type":"ContainerStarted","Data":"b6d1a7db8c3c321aee4917199fd666ae9ed2b43b34a6b26192ab70ece209b154"} Apr 24 21:17:50.169009 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:17:50.168940 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-d8gpx" podStartSLOduration=3.455214878 podStartE2EDuration="4.168904207s" podCreationTimestamp="2026-04-24 21:17:46 +0000 UTC" firstStartedPulling="2026-04-24 21:17:47.820059274 +0000 UTC m=+85.887664832" lastFinishedPulling="2026-04-24 21:17:48.5337486 +0000 UTC m=+86.601354161" observedRunningTime="2026-04-24 21:17:50.167523825 +0000 UTC m=+88.235129410" watchObservedRunningTime="2026-04-24 21:17:50.168904207 +0000 UTC m=+88.236509787" Apr 24 21:18:00.361714 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:00.361674 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6948fb494f-4hnkq"] Apr 24 21:18:25.261075 ip-10-0-132-219 kubenswrapper[2578]: I0424 
21:18:25.260971 2578 generic.go:358] "Generic (PLEG): container finished" podID="11dd689a-4bd3-402d-99a1-0a89fed0b025" containerID="25830fce8e1b4006ec4d0df666c57fb0a537020ffdc4312b2f939c95ce151d74" exitCode=0 Apr 24 21:18:25.261075 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.261050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" event={"ID":"11dd689a-4bd3-402d-99a1-0a89fed0b025","Type":"ContainerDied","Data":"25830fce8e1b4006ec4d0df666c57fb0a537020ffdc4312b2f939c95ce151d74"} Apr 24 21:18:25.261538 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.261397 2578 scope.go:117] "RemoveContainer" containerID="25830fce8e1b4006ec4d0df666c57fb0a537020ffdc4312b2f939c95ce151d74" Apr 24 21:18:25.384879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.384830 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" podUID="afad5a5d-bb00-427a-ad3b-6ad4a899dd16" containerName="registry" containerID="cri-o://b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc" gracePeriod=30 Apr 24 21:18:25.634720 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.634696 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" Apr 24 21:18:25.764263 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764216 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-image-registry-private-configuration\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764470 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764287 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-bound-sa-token\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764470 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764314 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-trusted-ca\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764470 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764346 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-ca-trust-extracted\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764470 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764371 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-certificates\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: 
\"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764470 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-installation-pull-secrets\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764470 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764470 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtc9x\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-kube-api-access-wtc9x\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.764789 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764516 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") pod \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\" (UID: \"afad5a5d-bb00-427a-ad3b-6ad4a899dd16\") " Apr 24 21:18:25.765165 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.764847 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:25.765276 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.765246 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:25.767408 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.767357 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:25.767408 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.767357 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-kube-api-access-wtc9x" (OuterVolumeSpecName: "kube-api-access-wtc9x") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "kube-api-access-wtc9x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:25.767568 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.767412 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:25.767568 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.767429 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:25.767568 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.767451 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:25.774136 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.774092 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "afad5a5d-bb00-427a-ad3b-6ad4a899dd16" (UID: "afad5a5d-bb00-427a-ad3b-6ad4a899dd16"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865813 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-image-registry-private-configuration\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865846 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-bound-sa-token\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865860 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-trusted-ca\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865868 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-ca-trust-extracted\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865878 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-certificates\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865888 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-installation-pull-secrets\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath 
\"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865897 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtc9x\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-kube-api-access-wtc9x\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:25.865903 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:25.865908 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afad5a5d-bb00-427a-ad3b-6ad4a899dd16-registry-tls\") on node \"ip-10-0-132-219.ec2.internal\" DevicePath \"\"" Apr 24 21:18:26.266653 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.266618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2dptz" event={"ID":"11dd689a-4bd3-402d-99a1-0a89fed0b025","Type":"ContainerStarted","Data":"354c20e5a5e000603d2df437c3995d093094a675803c484c9a03fa068d3bf409"} Apr 24 21:18:26.267817 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.267791 2578 generic.go:358] "Generic (PLEG): container finished" podID="afad5a5d-bb00-427a-ad3b-6ad4a899dd16" containerID="b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc" exitCode=0 Apr 24 21:18:26.267902 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.267845 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq"
Apr 24 21:18:26.267902 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.267856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" event={"ID":"afad5a5d-bb00-427a-ad3b-6ad4a899dd16","Type":"ContainerDied","Data":"b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc"}
Apr 24 21:18:26.267902 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.267880 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6948fb494f-4hnkq" event={"ID":"afad5a5d-bb00-427a-ad3b-6ad4a899dd16","Type":"ContainerDied","Data":"df552912e815049c67525343bab51e45261a42ab4908261d5d6a635cbc6cbe5b"}
Apr 24 21:18:26.267902 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.267895 2578 scope.go:117] "RemoveContainer" containerID="b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc"
Apr 24 21:18:26.277461 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.277436 2578 scope.go:117] "RemoveContainer" containerID="b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc"
Apr 24 21:18:26.277803 ip-10-0-132-219 kubenswrapper[2578]: E0424 21:18:26.277770 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc\": container with ID starting with b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc not found: ID does not exist" containerID="b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc"
Apr 24 21:18:26.277856 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.277814 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc"} err="failed to get container status \"b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc\": rpc error: code = NotFound desc = could not find container \"b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc\": container with ID starting with b3238555ce5ca958f0dcfd805590ecc267d0d9eaa83f23b6286f9bccd62156cc not found: ID does not exist"
Apr 24 21:18:26.301643 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.301605 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6948fb494f-4hnkq"]
Apr 24 21:18:26.303572 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.303543 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6948fb494f-4hnkq"]
Apr 24 21:18:26.528278 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:26.528197 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afad5a5d-bb00-427a-ad3b-6ad4a899dd16" path="/var/lib/kubelet/pods/afad5a5d-bb00-427a-ad3b-6ad4a899dd16/volumes"
Apr 24 21:18:31.284110 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:31.284074 2578 generic.go:358] "Generic (PLEG): container finished" podID="940bb44f-905e-43d7-a9d4-7cb49ef94c0e" containerID="c9b3d553c396b36fc155c1feeba8391dc3cbfa1151cc5acb11da3aa4e19bb695" exitCode=0
Apr 24 21:18:31.284534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:31.284157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" event={"ID":"940bb44f-905e-43d7-a9d4-7cb49ef94c0e","Type":"ContainerDied","Data":"c9b3d553c396b36fc155c1feeba8391dc3cbfa1151cc5acb11da3aa4e19bb695"}
Apr 24 21:18:31.284534 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:31.284489 2578 scope.go:117] "RemoveContainer" containerID="c9b3d553c396b36fc155c1feeba8391dc3cbfa1151cc5acb11da3aa4e19bb695"
Apr 24 21:18:32.288692 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:32.288657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l4p7x" event={"ID":"940bb44f-905e-43d7-a9d4-7cb49ef94c0e","Type":"ContainerStarted","Data":"fd695f906326caa42981ec99f54cc06a465fa5099a0df4f2000622dd01ccf229"}
Apr 24 21:18:35.872735 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:35.872670 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" podUID="5e304870-95cc-4401-8456-8388b7b9d759" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:18:44.325459 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:44.325420 2578 generic.go:358] "Generic (PLEG): container finished" podID="4306556d-2b6a-4b8d-b6ef-8342d2ce44d8" containerID="e09d262941f80c24d042e1b8febde369f9c198cfd873ad6593edf14a37c3f88f" exitCode=0
Apr 24 21:18:44.325836 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:44.325489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" event={"ID":"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8","Type":"ContainerDied","Data":"e09d262941f80c24d042e1b8febde369f9c198cfd873ad6593edf14a37c3f88f"}
Apr 24 21:18:44.325880 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:44.325865 2578 scope.go:117] "RemoveContainer" containerID="e09d262941f80c24d042e1b8febde369f9c198cfd873ad6593edf14a37c3f88f"
Apr 24 21:18:45.330633 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:45.330594 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bm7ns" event={"ID":"4306556d-2b6a-4b8d-b6ef-8342d2ce44d8","Type":"ContainerStarted","Data":"aff518b3b8512029d212079a42ae4a6afdca279b5020c7ac35cd14360000b5cb"}
Apr 24 21:18:45.872792 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:45.872752 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" podUID="5e304870-95cc-4401-8456-8388b7b9d759" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:18:55.872601 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:55.872550 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" podUID="5e304870-95cc-4401-8456-8388b7b9d759" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:18:55.873045 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:55.872642 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4"
Apr 24 21:18:55.873369 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:55.873344 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"395a5e38728c7becbdd76381e070026a5dec3bd0f27036d871ddb0416aaad1dc"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 21:18:55.873420 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:55.873402 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" podUID="5e304870-95cc-4401-8456-8388b7b9d759" containerName="service-proxy" containerID="cri-o://395a5e38728c7becbdd76381e070026a5dec3bd0f27036d871ddb0416aaad1dc" gracePeriod=30
Apr 24 21:18:56.367654 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:56.367620 2578 generic.go:358] "Generic (PLEG): container finished" podID="5e304870-95cc-4401-8456-8388b7b9d759" containerID="395a5e38728c7becbdd76381e070026a5dec3bd0f27036d871ddb0416aaad1dc" exitCode=2
Apr 24 21:18:56.367849 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:56.367695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" event={"ID":"5e304870-95cc-4401-8456-8388b7b9d759","Type":"ContainerDied","Data":"395a5e38728c7becbdd76381e070026a5dec3bd0f27036d871ddb0416aaad1dc"}
Apr 24 21:18:56.367849 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:18:56.367744 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5df456cfb5-vv7p4" event={"ID":"5e304870-95cc-4401-8456-8388b7b9d759","Type":"ContainerStarted","Data":"f91a7dfe8fb583d1418fa800b287d4cb270635189e51ed4ce9101b0c760a21d6"}
Apr 24 21:21:22.463189 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:21:22.463157 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:21:22.463738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:21:22.463230 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:21:22.471966 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:21:22.471917 2578 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:26:22.492951 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:26:22.492902 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:26:22.493624 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:26:22.493602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:31:22.514023 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:31:22.513942 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:31:22.516034 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:31:22.516009 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:36:22.535303 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:36:22.535270 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:36:22.537815 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:36:22.537789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:38:02.721062 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:02.721025 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-c7xmt_e57ddd4a-8801-45ab-b463-3398cdeea471/global-pull-secret-syncer/0.log"
Apr 24 21:38:02.967283 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:02.967248 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q9jpr_647bf41d-3e57-40c7-bd39-3154d24499dd/konnectivity-agent/0.log"
Apr 24 21:38:03.028475 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:03.028383 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-219.ec2.internal_88cded07b618f16d406fa0098b2baa8d/haproxy/0.log"
Apr 24 21:38:06.659819 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:06.659704 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-bhp8b_fe5f6e68-6c62-42ed-8fb3-c8b29635d5bb/cluster-monitoring-operator/0.log"
Apr 24 21:38:06.934738 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:06.934704 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d8gpx_040f6ff5-58c2-4a5b-be11-5ba8475596fd/node-exporter/0.log"
Apr 24 21:38:06.982032 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:06.981991 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d8gpx_040f6ff5-58c2-4a5b-be11-5ba8475596fd/kube-rbac-proxy/0.log"
Apr 24 21:38:07.026469 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:07.026441 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d8gpx_040f6ff5-58c2-4a5b-be11-5ba8475596fd/init-textfile/0.log"
Apr 24 21:38:09.116276 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.116248 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-jq9c6_e58045bb-0010-4145-94ce-dd45fc5b114f/networking-console-plugin/0.log"
Apr 24 21:38:09.551897 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.550947 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/1.log"
Apr 24 21:38:09.562459 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.562424 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wxcvf_158622c6-3d53-4f0c-bd64-00a4f1f32d69/console-operator/2.log"
Apr 24 21:38:09.665401 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.665362 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"]
Apr 24 21:38:09.665830 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.665812 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afad5a5d-bb00-427a-ad3b-6ad4a899dd16" containerName="registry"
Apr 24 21:38:09.665883 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.665834 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="afad5a5d-bb00-427a-ad3b-6ad4a899dd16" containerName="registry"
Apr 24 21:38:09.665962 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.665949 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="afad5a5d-bb00-427a-ad3b-6ad4a899dd16" containerName="registry"
Apr 24 21:38:09.668868 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.668850 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.671280 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.671246 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pg6qz\"/\"openshift-service-ca.crt\""
Apr 24 21:38:09.671280 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.671264 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pg6qz\"/\"kube-root-ca.crt\""
Apr 24 21:38:09.671467 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.671371 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pg6qz\"/\"default-dockercfg-xvlcd\""
Apr 24 21:38:09.680357 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.680326 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"]
Apr 24 21:38:09.786252 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.786217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-lib-modules\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.786252 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.786253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdqv\" (UniqueName: \"kubernetes.io/projected/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-kube-api-access-mfdqv\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.786506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.786296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-podres\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.786506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.786331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-sys\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.786506 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.786354 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-proc\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887082 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.886981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-proc\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887082 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-lib-modules\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdqv\" (UniqueName: \"kubernetes.io/projected/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-kube-api-access-mfdqv\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-proc\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-podres\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-sys\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887233 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-lib-modules\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-sys\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.887320 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.887284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-podres\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.902388 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.902356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdqv\" (UniqueName: \"kubernetes.io/projected/ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d-kube-api-access-mfdqv\") pod \"perf-node-gather-daemonset-cvjz9\" (UID: \"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:09.979956 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:09.979888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:10.115274 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.115237 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"]
Apr 24 21:38:10.118681 ip-10-0-132-219 kubenswrapper[2578]: W0424 21:38:10.118639 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca5e13bf_1a6f_43bf_bd14_a8cf48a1dd7d.slice/crio-90da6436afa114b9c74dc9a75bafee653524e84df244cd90998cdabe6ce51cf3 WatchSource:0}: Error finding container 90da6436afa114b9c74dc9a75bafee653524e84df244cd90998cdabe6ce51cf3: Status 404 returned error can't find the container with id 90da6436afa114b9c74dc9a75bafee653524e84df244cd90998cdabe6ce51cf3
Apr 24 21:38:10.120483 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.120464 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:38:10.508710 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.508673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-6l5v9_b7e9209d-9d38-4fb0-a8ad-003c896fe276/volume-data-source-validator/0.log"
Apr 24 21:38:10.793550 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.793458 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9" event={"ID":"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d","Type":"ContainerStarted","Data":"506481f0d4253a7bf52038c66f725a557c13485be18fa473ef060cf6387c1a2a"}
Apr 24 21:38:10.793550 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.793494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9" event={"ID":"ca5e13bf-1a6f-43bf-bd14-a8cf48a1dd7d","Type":"ContainerStarted","Data":"90da6436afa114b9c74dc9a75bafee653524e84df244cd90998cdabe6ce51cf3"}
Apr 24 21:38:10.793785 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.793563 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:10.812274 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:10.812224 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9" podStartSLOduration=1.8122076059999999 podStartE2EDuration="1.812207606s" podCreationTimestamp="2026-04-24 21:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:10.810990318 +0000 UTC m=+1308.878595893" watchObservedRunningTime="2026-04-24 21:38:10.812207606 +0000 UTC m=+1308.879813185"
Apr 24 21:38:11.454343 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:11.454310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v4ffd_dd6441e0-b9f0-483a-95f8-56bff0d86e71/dns/0.log"
Apr 24 21:38:11.484498 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:11.484463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v4ffd_dd6441e0-b9f0-483a-95f8-56bff0d86e71/kube-rbac-proxy/0.log"
Apr 24 21:38:11.511095 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:11.511063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hdlfg_42a9166c-4a9e-42ec-8194-8b7b3852c9a2/dns-node-resolver/0.log"
Apr 24 21:38:12.045156 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:12.045126 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7p6s9_5f9a3a11-4bde-44e9-a68d-2d9ababf72d3/node-ca/0.log"
Apr 24 21:38:12.887631 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:12.887590 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b7858ff9b-qlsj4_ca1e5038-3f4a-4f35-ac48-63b7f9fa576a/router/0.log"
Apr 24 21:38:13.323637 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:13.323606 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mxfwh_7321b947-9a04-45eb-a042-a216d960cbb7/serve-healthcheck-canary/0.log"
Apr 24 21:38:13.800431 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:13.800387 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-bm7ns_4306556d-2b6a-4b8d-b6ef-8342d2ce44d8/insights-operator/1.log"
Apr 24 21:38:13.800686 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:13.800665 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-bm7ns_4306556d-2b6a-4b8d-b6ef-8342d2ce44d8/insights-operator/0.log"
Apr 24 21:38:13.825297 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:13.825266 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6n8rw_04c1b2a9-55bb-42b1-8157-4885f8b97dc0/kube-rbac-proxy/0.log"
Apr 24 21:38:13.847301 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:13.847269 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6n8rw_04c1b2a9-55bb-42b1-8157-4885f8b97dc0/exporter/0.log"
Apr 24 21:38:13.870074 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:13.870040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6n8rw_04c1b2a9-55bb-42b1-8157-4885f8b97dc0/extractor/0.log"
Apr 24 21:38:16.808149 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:16.808121 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-cvjz9"
Apr 24 21:38:20.550706 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:20.550670 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-2sqrs_be760740-4d3a-496f-91a2-cef6e63a178b/migrator/0.log"
Apr 24 21:38:20.579202 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:20.579166 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-2sqrs_be760740-4d3a-496f-91a2-cef6e63a178b/graceful-termination/0.log"
Apr 24 21:38:20.897336 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:20.897297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2dptz_11dd689a-4bd3-402d-99a1-0a89fed0b025/kube-storage-version-migrator-operator/1.log"
Apr 24 21:38:20.898832 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:20.898799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2dptz_11dd689a-4bd3-402d-99a1-0a89fed0b025/kube-storage-version-migrator-operator/0.log"
Apr 24 21:38:21.766975 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:21.766918 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6fjn7_b8465e7e-a69f-4eff-bf07-d8e7f8de3cdb/kube-multus/0.log"
Apr 24 21:38:22.138560 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.138480 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:38:22.161247 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.161213 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/egress-router-binary-copy/0.log"
Apr 24 21:38:22.186768 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.186734 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/cni-plugins/0.log"
Apr 24 21:38:22.206809 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.206778 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/bond-cni-plugin/0.log"
Apr 24 21:38:22.229326 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.229267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/routeoverride-cni/0.log"
Apr 24 21:38:22.259139 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.259105 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/whereabouts-cni-bincopy/0.log"
Apr 24 21:38:22.287166 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.287129 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wpmrx_d1b4f3db-7875-478c-93b8-7c3155edd974/whereabouts-cni/0.log"
Apr 24 21:38:22.386195 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.386144 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4xbxm_bfa91d92-6b3a-44d1-8b22-abae9ded2a1c/network-metrics-daemon/0.log"
Apr 24 21:38:22.410374 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:22.410288 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4xbxm_bfa91d92-6b3a-44d1-8b22-abae9ded2a1c/kube-rbac-proxy/0.log"
Apr 24 21:38:23.994938 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:23.994905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/ovn-controller/0.log"
Apr 24 21:38:24.029349 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.029311 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/ovn-acl-logging/0.log"
Apr 24 21:38:24.053662 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.053632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/kube-rbac-proxy-node/0.log"
Apr 24 21:38:24.076435 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.076398 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:38:24.096879 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.096849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/northd/0.log"
Apr 24 21:38:24.123877 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.123843 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/nbdb/0.log"
Apr 24 21:38:24.190568 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.190536 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/sbdb/0.log"
Apr 24 21:38:24.375639 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:24.375547 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqvlx_74c3a396-cd8f-4290-9e5d-1a182b254157/ovnkube-controller/0.log"
Apr 24 21:38:25.272824 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:25.272784 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-2hr4w_8408d184-7b3a-4075-9db8-5cb9fae0821c/check-endpoints/0.log"
Apr 24 21:38:25.329218 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:25.329184 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pnbsk_763a8f03-407c-4bd4-b683-27ba0614f163/network-check-target-container/0.log"
Apr 24 21:38:26.320536 ip-10-0-132-219 kubenswrapper[2578]: I0424 21:38:26.320511 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fr8rd_4296bff6-4cb8-423e-a188-d5e73736c322/iptables-alerter/0.log"