Apr 22 19:23:30.815914 ip-10-0-131-132 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:31.317537 ip-10-0-131-132 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:31.317537 ip-10-0-131-132 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:31.317537 ip-10-0-131-132 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:31.317537 ip-10-0-131-132 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:31.317537 ip-10-0-131-132 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:31.320422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.320319 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328508 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328536 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328541 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328545 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328548 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328551 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328556 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.328542 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328559 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328562 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328565 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328568 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328583 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328586 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328588 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328591 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328595 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328599 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328603 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328607 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328610 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328621 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328624 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328627 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328631 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328635 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328638 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:31.328864 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328641 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328643 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328646 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328649 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328652 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328654 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328657 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328660 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328662 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328665 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328668 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328671 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328675 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328677 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328680 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328684 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328688 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328690 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328693 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328695 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:31.329322 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328698 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328702 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328704 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328707 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328710 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328712 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328715 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328717 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328720 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328723 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328726 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328728 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328731 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328733 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328736 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328739 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328741 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328744 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328746 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328749 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:31.329841 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328751 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328754 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328762 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328765 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328768 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328770 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328773 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328777 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328780 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328782 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328785 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328787 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328792 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328794 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328797 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328800 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328803 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328805 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328808 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:31.330344 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.328810 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329256 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329262 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329266 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329269 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329272 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329275 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329277 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329280 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329283 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329286 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329288 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329291 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329294 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329296 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329299 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329302 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329306 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329309 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329311 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:31.330822 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329315 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329318 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329321 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329324 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329327 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329330 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329333 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329335 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329338 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329340 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329343 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329345 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329348 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329351 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329353 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329356 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329358 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329362 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329364 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329367 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.331340 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329369 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329372 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329375 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329377 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329380 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329382 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329384 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329387 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329390 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329393 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329396 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329399 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329401 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329404 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329407 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329410 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329412 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329415 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329418 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329421 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:31.331912 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329424 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329426 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329428 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329431 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329434 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329436 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329439 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329442 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329444 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329447 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329450 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329452 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329455 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329457 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329460 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329463 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329465 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329468 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329470 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:31.332406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329473 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329477 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329480 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329482 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329485 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329488 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329490 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.329493 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329586 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329595 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329602 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329606 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329623 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329627 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329636 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329641 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329644 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329648 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329651 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329655 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329658 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329661 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329664 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:31.332896 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329667 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329670 2570 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329673 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329676 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329680 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329683 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329686 2570 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329690 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329694 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329698 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329701 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329704 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329708 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329711 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329714 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329717 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329720 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329723 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329728 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329731 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329734 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329737 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329740 2570 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329744 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329749 2570 flags.go:64] FLAG: --event-burst="100" Apr 22 19:23:31.333534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329752 2570 flags.go:64] FLAG: --event-qps="50" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329755 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329758 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329762 2570 flags.go:64] FLAG: --eviction-hard="" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329766 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329769 2570 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329772 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329775 2570 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329778 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329781 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329785 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329788 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329791 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329794 
2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329797 2570 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329801 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329805 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329808 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329811 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329814 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329817 2570 flags.go:64] FLAG: --help="false" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329820 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-131-132.ec2.internal" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329823 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329827 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:31.334163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329830 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329833 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329837 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:31.334758 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:23:31.329840 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329843 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329846 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329850 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329853 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329857 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329859 2570 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329862 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329865 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329873 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329876 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329879 2570 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329882 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329885 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329889 2570 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329895 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329898 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329901 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329904 2570 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329906 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329910 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:31.334758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329913 2570 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329916 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329921 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329924 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329928 2570 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329931 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329934 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329937 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:31.335320 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:23:31.329940 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329943 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329946 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329949 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329957 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329961 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329964 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329968 2570 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329971 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329977 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329980 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329983 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329987 2570 flags.go:64] FLAG: --port="10250" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329991 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 
19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329994 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0978cd31596d5a71a" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.329997 2570 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:31.335320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330000 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330004 2570 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330007 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330010 2570 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330013 2570 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330016 2570 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330019 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330022 2570 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330027 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330030 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330033 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330036 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330039 2570 flags.go:64] FLAG: --runonce="false" Apr 22 
19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330041 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330045 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330048 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330051 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330054 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330057 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330060 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330065 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330068 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330071 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330074 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330078 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330081 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:31.335936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330084 2570 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: 
I0422 19:23:31.330087 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330093 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330097 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330100 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330104 2570 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330108 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330111 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330114 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330117 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330120 2570 flags.go:64] FLAG: --v="2" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330125 2570 flags.go:64] FLAG: --version="false" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330129 2570 flags.go:64] FLAG: --vmodule="" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330133 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.330137 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330252 2570 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330257 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330260 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330262 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330266 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330268 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330271 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330274 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:31.336591 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330276 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330279 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330282 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330286 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330288 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330291 2570 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330294 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330296 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330301 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330305 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330308 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330310 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330315 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330318 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330321 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330324 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330327 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330330 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation 
Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330333 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330336 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:31.337160 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330339 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330342 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330344 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330347 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330351 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330354 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330357 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330360 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330363 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330365 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 
19:23:31.330368 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330371 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330374 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330376 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330379 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330384 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330387 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330389 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330392 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330395 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:31.337725 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330398 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330401 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330403 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:31.338228 
ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330406 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330409 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330412 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330414 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330417 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330420 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330422 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330425 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330427 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330430 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330433 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330435 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330438 2570 feature_gate.go:328] unrecognized 
feature gate: PreconfiguredUDNAddresses Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330440 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330443 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330446 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:31.338228 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330450 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330453 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330456 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330458 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330461 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330463 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330466 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330468 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330473 2570 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330475 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330478 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330480 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330483 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330486 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330488 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330491 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330494 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330497 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:31.338718 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.330500 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.331311 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.338590 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.338612 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338705 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338711 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338715 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338718 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338721 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338724 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338727 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338731 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338734 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 
19:23:31.338737 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338740 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:31.339172 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338743 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338745 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338748 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338751 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338754 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338756 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338759 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338761 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338764 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338767 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338770 2570 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338772 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338775 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338778 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338782 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338785 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338787 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338790 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338792 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338795 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:31.339594 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338798 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338801 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338803 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:31.340100 
ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338806 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338809 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338812 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338815 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338818 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338821 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338824 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338826 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338829 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338832 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338836 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338840 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338843 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338846 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338849 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338851 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:31.340100 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338854 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338857 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338860 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338862 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338865 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338868 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338872 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338875 2570 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338878 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338881 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338884 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338886 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338890 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338893 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338896 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338899 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338902 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338904 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338907 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:31.340564 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338909 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:31.340564 ip-10-0-131-132 
kubenswrapper[2570]: W0422 19:23:31.338912 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338915 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338917 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338921 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338925 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338928 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338931 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338934 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338936 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338939 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338941 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338944 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338947 2570 feature_gate.go:328] 
unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338949 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338952 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:31.341066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.338955 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.338960 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339085 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339089 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339096 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339099 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339105 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339108 2570 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339111 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339113 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339116 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339119 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339122 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339125 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339127 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:31.341475 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339130 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339132 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339137 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339140 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339143 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339147 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339150 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339152 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339155 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339157 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339160 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339162 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339165 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339168 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339171 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339175 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339178 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339181 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339183 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:31.341952 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339187 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339189 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339192 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339195 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339197 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339205 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339208 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339211 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 
19:23:31.339214 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339217 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339219 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339222 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339225 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339227 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339230 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339232 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339235 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339238 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339240 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339243 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:31.342406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339245 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 
22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339247 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339250 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339252 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339255 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339257 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339260 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339263 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339265 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339268 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339270 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339273 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339276 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339278 2570 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339281 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339284 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339287 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339289 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339298 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339300 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:31.342911 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339303 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339306 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339309 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339312 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339314 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339317 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339320 
2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339322 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339325 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339328 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339331 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339333 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339336 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:31.339339 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.339344 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:31.343394 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.340123 2570 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:23:31.343802 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.342699 2570 
bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:23:31.343802 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.343716 2570 server.go:1019] "Starting client certificate rotation" Apr 22 19:23:31.343862 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.343814 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:31.343862 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.343853 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:31.368677 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.368650 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:31.370948 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.370927 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:31.396279 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.396243 2570 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:23:31.405298 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.405273 2570 log.go:25] "Validated CRI v1 image API" Apr 22 19:23:31.407087 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.407062 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:23:31.407200 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.407108 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:31.412769 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.412742 2570 fs.go:135] Filesystem UUIDs: map[35fc5507-6e01-4b62-b1d0-1990bbeff5f6:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 
941a45c9-0752-4ed0-9079-9f3db5b13dcc:/dev/nvme0n1p4] Apr 22 19:23:31.412857 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.412768 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:23:31.419723 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.419588 2570 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:31.417305239 +0000 UTC m=+0.467668720 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101916 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec266f2cf11122daeb9b509330ab8bb9 SystemUUID:ec266f2c-f111-22da-eb9b-509330ab8bb9 BootID:fff8df87-f024-4159-8a4c-5406d109bc4c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:02:2a:ea:db:5b:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2a:ea:db:5b:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:0d:a4:0e:f6:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:23:31.419723 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.419712 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:23:31.419848 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.419811 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:31.422320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.422285 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:31.422484 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.422323 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-132.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:31.422553 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.422492 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:31.422553 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.422502 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:31.422553 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.422519 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:31.422711 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.422684 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wp6k9"
Apr 22 19:23:31.424278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.424262 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:31.425975 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.425959 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:31.426139 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.426128 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 19:23:31.428833 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.428820 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 19:23:31.428833 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.428838 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 19:23:31.428944 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.428854 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 19:23:31.428944 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.428867 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 22 19:23:31.428944 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.428876 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 19:23:31.430191 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.430177 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:31.430234 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.430198 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:31.431017 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.431002 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wp6k9"
Apr 22 19:23:31.434740 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.434719 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 19:23:31.436415 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.436400 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 19:23:31.438375 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438363 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 19:23:31.438422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438384 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 19:23:31.438422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438394 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 19:23:31.438422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438400 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 19:23:31.438422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438406 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 19:23:31.438422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438412 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 19:23:31.438422 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438423 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 19:23:31.438586 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438429 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 19:23:31.438586 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438436 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 19:23:31.438586 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438443 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 19:23:31.438586 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438456 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 19:23:31.438586 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.438465 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 19:23:31.439298 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.439287 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 19:23:31.439333 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.439300 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 19:23:31.443250 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.443229 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 19:23:31.443351 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.443286 2570 server.go:1295] "Started kubelet"
Apr 22 19:23:31.443846 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.443814 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 19:23:31.444084 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.444022 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 19:23:31.444178 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.444114 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 19:23:31.444259 ip-10-0-131-132 systemd[1]: Started Kubernetes Kubelet.
Apr 22 19:23:31.446255 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.446235 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:23:31.447378 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.447355 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:31.447776 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.447763 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:23:31.448840 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.448815 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:31.452823 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.452806 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-132.ec2.internal" not found
Apr 22 19:23:31.452969 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.452954 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:23:31.453023 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.452969 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:31.454806 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.454782 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-132.ec2.internal\" not found"
Apr 22 19:23:31.455470 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.455452 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:23:31.455470 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.455456 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:31.455617 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.455480 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:23:31.455617 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.455494 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:23:31.455714 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.455633 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:23:31.455714 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.455646 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:23:31.458714 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.458693 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-132.ec2.internal\" not found" node="ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.458815 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.458725 2570 factory.go:55] Registering systemd factory
Apr 22 19:23:31.458815 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.458768 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:23:31.458988 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.458977 2570 factory.go:153] Registering CRI-O factory
Apr 22 19:23:31.458988 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.458989 2570 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:23:31.459055 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.459038 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:23:31.459085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.459058 2570 factory.go:103] Registering Raw factory
Apr 22 19:23:31.459085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.459072 2570 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:23:31.459488 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.459476 2570 manager.go:319] Starting recovery of all containers
Apr 22 19:23:31.459992 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.459966 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:23:31.469841 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.469710 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-132.ec2.internal" not found
Apr 22 19:23:31.469841 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.469780 2570 manager.go:324] Recovery completed
Apr 22 19:23:31.473874 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.473858 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:31.476112 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.476096 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:31.476179 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.476126 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:31.476179 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.476139 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:31.476627 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.476615 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 19:23:31.476679 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.476626 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 19:23:31.476679 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.476645 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:31.481273 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.481260 2570 policy_none.go:49] "None policy: Start"
Apr 22 19:23:31.481313 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.481277 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 19:23:31.481313 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.481287 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523192 2570 manager.go:341] "Starting Device Plugin manager"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.523246 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523259 2570 server.go:85] "Starting device plugin registration server"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523560 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523589 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523681 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523757 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.523768 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.524317 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.524354 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-132.ec2.internal\" not found"
Apr 22 19:23:31.532928 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.527224 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-132.ec2.internal" not found
Apr 22 19:23:31.589150 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.589061 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 19:23:31.590330 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.590310 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 19:23:31.590398 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.590347 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 19:23:31.590398 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.590371 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 19:23:31.590398 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.590380 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 19:23:31.590541 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:31.590425 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 19:23:31.593021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.592998 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:31.624486 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.624458 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:31.625621 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.625606 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:31.625684 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.625638 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:31.625684 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.625653 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:31.625684 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.625679 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.635428 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.635410 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.690728 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.690693 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"]
Apr 22 19:23:31.693582 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.693554 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.693582 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.693564 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.718172 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.718145 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.721737 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.721721 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.747249 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.747227 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:31.749675 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.749656 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:31.757844 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.757818 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a12817d60fd512a01cb9024b96513bd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal\" (UID: \"a12817d60fd512a01cb9024b96513bd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.757945 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.757853 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a12817d60fd512a01cb9024b96513bd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal\" (UID: \"a12817d60fd512a01cb9024b96513bd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.757945 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.757881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e33fdcedf15ae63b7d7592c581d6241e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-132.ec2.internal\" (UID: \"e33fdcedf15ae63b7d7592c581d6241e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.858244 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.858221 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a12817d60fd512a01cb9024b96513bd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal\" (UID: \"a12817d60fd512a01cb9024b96513bd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.858318 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.858252 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a12817d60fd512a01cb9024b96513bd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal\" (UID: \"a12817d60fd512a01cb9024b96513bd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.858318 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.858270 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e33fdcedf15ae63b7d7592c581d6241e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-132.ec2.internal\" (UID: \"e33fdcedf15ae63b7d7592c581d6241e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.858318 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.858316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e33fdcedf15ae63b7d7592c581d6241e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-132.ec2.internal\" (UID: \"e33fdcedf15ae63b7d7592c581d6241e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.858435 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.858323 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a12817d60fd512a01cb9024b96513bd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal\" (UID: \"a12817d60fd512a01cb9024b96513bd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:31.858435 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:31.858344 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a12817d60fd512a01cb9024b96513bd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal\" (UID: \"a12817d60fd512a01cb9024b96513bd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:32.049908 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.049852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:32.052642 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.052613 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal"
Apr 22 19:23:32.343664 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.343624 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:32.344369 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.343777 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:32.344369 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.343785 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:32.344369 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.343777 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:32.429360 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.429320 2570 apiserver.go:52] "Watching apiserver"
Apr 22 19:23:32.433197 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.433168 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:31 +0000 UTC" deadline="2027-11-18 12:49:22.203204607 +0000 UTC"
Apr 22 19:23:32.433251 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.433199 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13793h25m49.770009179s"
Apr 22 19:23:32.438366 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.438350 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:23:32.438724 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.438703 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-gkcvh","kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l","openshift-dns/node-resolver-8w5lr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal","openshift-multus/multus-qnx9f","openshift-network-operator/iptables-alerter-mgsvn","openshift-ovn-kubernetes/ovnkube-node-cszr4","openshift-cluster-node-tuning-operator/tuned-8cg7f","openshift-image-registry/node-ca-6gbdc","openshift-multus/multus-additional-cni-plugins-kxfcc","openshift-multus/network-metrics-daemon-gwm2k","openshift-network-diagnostics/network-check-target-mch7s"]
Apr 22 19:23:32.441485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.441468 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:32.442669 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.442648 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.443975 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.443942 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8w5lr"
Apr 22 19:23:32.444053 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.444020 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.444215 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.444178 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:32.444949 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.444780 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x9h4c\""
Apr 22 19:23:32.445936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.445399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:23:32.445936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.445464 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-z2btp\""
Apr 22 19:23:32.445936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.445406 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:32.445936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.445765 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mgsvn"
Apr 22 19:23:32.445936 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.445788 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:23:32.446267 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.446249 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:23:32.446824 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.446804 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b9xkx\""
Apr 22 19:23:32.446931 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.446804 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:23:32.446931 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.446811 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:23:32.447481 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.447463 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.447767 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.447645 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:23:32.447856 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.447838 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:23:32.447856 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.447850 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:23:32.447973 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.447852 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-t7m4l\""
Apr 22 19:23:32.448562 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.448548 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:23:32.448892 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.448873 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.449057 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.449039 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:32.449131 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.449044 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:32.449388 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.449373 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d7xfq\""
Apr 22 19:23:32.449440 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.449396 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:23:32.450444 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450424 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6gbdc"
Apr 22 19:23:32.450555 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450471 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:23:32.450640 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450557 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:23:32.450640 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450584 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:23:32.450729 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450561 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:23:32.450954 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450941 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vrcmd\""
Apr 22 19:23:32.450987 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450959 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:23:32.451032 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.450960 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:23:32.451883 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.451863 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:32.451954 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.451866 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.452512 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.452497 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:32.452623 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.452529 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xn94d\"" Apr 22 19:23:32.453049 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.453033 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:32.453307 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.453288 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:32.453528 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.453508 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:32.453612 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.453589 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:32.453788 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.453769 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:32.454266 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.454253 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x97sx\"" Apr 22 19:23:32.454625 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.454602 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:32.454831 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.454815 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:32.454896 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.454878 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:32.456317 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.456289 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:32.456392 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.456359 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:23:32.456539 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.456520 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:32.456644 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.456604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rswps\"" Apr 22 19:23:32.460924 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.460903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-os-release\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.460928 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-ovn\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.460944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-cni-bin\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.460964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysctl-conf\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.460988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-registration-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4k2\" (UniqueName: \"kubernetes.io/projected/9d618274-e61e-4ac5-b98d-0316d3addc15-kube-api-access-xk4k2\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461035 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461060 ip-10-0-131-132 kubenswrapper[2570]: I0422 
19:23:32.461055 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovnkube-script-lib\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-sys\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461097 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-multus-certs\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-k8s-cni-cncf-io\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461160 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-cni-multus\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " 
pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461193 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-systemd\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-netns\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-kubelet\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461283 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-tuned\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461336 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-serviceca\") pod \"node-ca-6gbdc\" (UID: 
\"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc" Apr 22 19:23:32.461381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461363 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-system-cni-dir\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9gp\" (UniqueName: \"kubernetes.io/projected/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-kube-api-access-8g9gp\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461416 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461447 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvjc\" (UniqueName: \"kubernetes.io/projected/0f0df841-c168-48f6-9e2e-f209a8216c52-kube-api-access-kxvjc\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461471 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-cni-netd\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2d99\" (UniqueName: \"kubernetes.io/projected/5df89727-eca2-4929-8ab4-9c1a7832889b-kube-api-access-p2d99\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461525 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-conf-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-run-netns\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-var-lib-kubelet\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-device-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461628 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovnkube-config\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461646 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-kubernetes\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-host\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d618274-e61e-4ac5-b98d-0316d3addc15-tmp-dir\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461704 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-etc-kubernetes\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.461758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461737 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysctl-d\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461755 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-lib-modules\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461771 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d-agent-certs\") pod \"konnectivity-agent-gkcvh\" (UID: \"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d\") " pod="kube-system/konnectivity-agent-gkcvh" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461834 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-run\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-systemd-units\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-var-lib-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461957 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovn-node-metrics-cert\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.461988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462016 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cnibin\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462054 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-system-cni-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462093 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-cnibin\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc7kr\" (UniqueName: \"kubernetes.io/projected/2bee9e68-7e05-489d-adaf-1e469041f7c1-kube-api-access-fc7kr\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc7h9\" (UniqueName: \"kubernetes.io/projected/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-kube-api-access-jc7h9\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462162 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldqsk\" (UniqueName: \"kubernetes.io/projected/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-kube-api-access-ldqsk\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc" Apr 22 19:23:32.462324 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462179 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-cni-bin\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462196 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f0df841-c168-48f6-9e2e-f209a8216c52-host-slash\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462224 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-etc-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462262 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-log-socket\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462289 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462319 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-modprobe-d\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-host\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462376 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-tmp\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462401 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-sys-fs\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-cni-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462529 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-slash\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysconfig\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 
19:23:32.462598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d-konnectivity-ca\") pod \"konnectivity-agent-gkcvh\" (UID: \"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d\") " pod="kube-system/konnectivity-agent-gkcvh" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-socket-dir-parent\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462647 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-daemon-config\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f0df841-c168-48f6-9e2e-f209a8216c52-iptables-alerter-script\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-kubelet\") pod \"ovnkube-node-cszr4\" (UID: 
\"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462720 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-env-overrides\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462743 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-systemd\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-socket-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462788 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-os-release\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-cni-binary-copy\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462857 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-hostroot\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmr2x\" (UniqueName: \"kubernetes.io/projected/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-kube-api-access-cmr2x\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-node-log\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462927 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw85s\" (UniqueName: \"kubernetes.io/projected/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-kube-api-access-zw85s\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.463552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.462964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d618274-e61e-4ac5-b98d-0316d3addc15-hosts-file\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.476445 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.476428 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:32.502917 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.502890 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-s9kmt" Apr 22 19:23:32.511872 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.511844 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-s9kmt" Apr 22 19:23:32.563697 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563671 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d618274-e61e-4ac5-b98d-0316d3addc15-hosts-file\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563707 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-os-release\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-ovn\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563778 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-ovn\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-os-release\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/9d618274-e61e-4ac5-b98d-0316d3addc15-hosts-file\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563824 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-cni-bin\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysctl-conf\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.563916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-cni-bin\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-registration-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563950 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysctl-conf\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4k2\" (UniqueName: \"kubernetes.io/projected/9d618274-e61e-4ac5-b98d-0316d3addc15-kube-api-access-xk4k2\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-registration-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.563980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovnkube-script-lib\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564028 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-sys\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564056 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-multus-certs\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-k8s-cni-cncf-io\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564110 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-multus-certs\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564136 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-cni-multus\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564156 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-k8s-cni-cncf-io\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564156 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-sys\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-systemd\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-cni-multus\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-run-systemd\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.564302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-netns\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-kubelet\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-tuned\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-serviceca\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-run-netns\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564311 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-kubelet\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564329 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-system-cni-dir\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9gp\" (UniqueName: \"kubernetes.io/projected/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-kube-api-access-8g9gp\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-system-cni-dir\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564411 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvjc\" (UniqueName: \"kubernetes.io/projected/0f0df841-c168-48f6-9e2e-f209a8216c52-kube-api-access-kxvjc\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-cni-netd\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2d99\" (UniqueName: \"kubernetes.io/projected/5df89727-eca2-4929-8ab4-9c1a7832889b-kube-api-access-p2d99\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-conf-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565128 
ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564545 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-run-netns\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-var-lib-kubelet\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-device-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.565128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564674 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovnkube-script-lib\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564693 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: 
\"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovnkube-config\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-kubernetes\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564730 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-host\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-conf-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d618274-e61e-4ac5-b98d-0316d3addc15-tmp-dir\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564828 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-etc-kubernetes\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564840 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-device-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564855 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysctl-d\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-lib-modules\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-serviceca\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-cni-netd\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564910 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-var-lib-kubelet\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d-agent-certs\") pod \"konnectivity-agent-gkcvh\" (UID: \"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d\") " pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564946 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-host\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.565887 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.564980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-run\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565002 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565017 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-run-netns\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565032 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-systemd-units\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-var-lib-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-run\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovn-node-metrics-cert\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cnibin\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-system-cni-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-cnibin\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7kr\" (UniqueName: \"kubernetes.io/projected/2bee9e68-7e05-489d-adaf-1e469041f7c1-kube-api-access-fc7kr\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565314 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jc7h9\" (UniqueName: \"kubernetes.io/projected/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-kube-api-access-jc7h9\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldqsk\" (UniqueName: \"kubernetes.io/projected/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-kube-api-access-ldqsk\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-cni-bin\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-system-cni-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.566710 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f0df841-c168-48f6-9e2e-f209a8216c52-host-slash\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f0df841-c168-48f6-9e2e-f209a8216c52-host-slash\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565430 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-etc-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-log-socket\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565470 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-cnibin\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565599 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565639 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cnibin\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-etc-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-log-socket\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-host-var-lib-cni-bin\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-modprobe-d\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-kubernetes\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-host\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-tmp\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.567534 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-host\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-sys-fs\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-systemd-units\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565948 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-modprobe-d\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.565970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-cni-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566028 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-slash\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysconfig\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566116 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d-konnectivity-ca\") pod \"konnectivity-agent-gkcvh\" (UID: \"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d\") " pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-socket-dir-parent\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-daemon-config\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566204 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f0df841-c168-48f6-9e2e-f209a8216c52-iptables-alerter-script\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566223 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovnkube-config\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-kubelet\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-env-overrides\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-systemd\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.568278 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-var-lib-openvswitch\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-socket-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-os-release\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566339 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-etc-kubernetes\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-sys-fs\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-cni-binary-copy\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-hostroot\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmr2x\" (UniqueName: \"kubernetes.io/projected/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-kube-api-access-cmr2x\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-node-log\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d618274-e61e-4ac5-b98d-0316d3addc15-tmp-dir\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566517 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw85s\" (UniqueName: \"kubernetes.io/projected/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-kube-api-access-zw85s\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566584 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-hostroot\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysctl-d\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566883 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-lib-modules\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.569118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-kubelet\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.566965 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-cni-binary-copy\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567034 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-systemd\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-socket-dir\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-os-release\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.567287 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bee9e68-7e05-489d-adaf-1e469041f7c1-env-overrides\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.567367 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.067336811 +0000 UTC m=+2.117700296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-daemon-config\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567414 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-node-log\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567437 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-sysconfig\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bee9e68-7e05-489d-adaf-1e469041f7c1-host-slash\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567637 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-cni-dir\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567682 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-multus-socket-dir-parent\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.567963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f0df841-c168-48f6-9e2e-f209a8216c52-iptables-alerter-script\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn"
Apr 22 19:23:32.569672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.568056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d-konnectivity-ca\") pod \"konnectivity-agent-gkcvh\" (UID: \"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d\") " pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:32.570150 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.568683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bee9e68-7e05-489d-adaf-1e469041f7c1-ovn-node-metrics-cert\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:32.570150 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.568852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d-agent-certs\") pod \"konnectivity-agent-gkcvh\" (UID: \"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d\") " pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:32.570150 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.569147 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-etc-tuned\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.570150 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.569903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-tmp\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.581309 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.581285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4k2\" (UniqueName: \"kubernetes.io/projected/9d618274-e61e-4ac5-b98d-0316d3addc15-kube-api-access-xk4k2\") pod \"node-resolver-8w5lr\" (UID: \"9d618274-e61e-4ac5-b98d-0316d3addc15\") " pod="openshift-dns/node-resolver-8w5lr"
Apr 22 19:23:32.583078 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.583045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldqsk\" (UniqueName: \"kubernetes.io/projected/3b3a8d77-2840-4166-a03a-e49d2f4f7de6-kube-api-access-ldqsk\") pod \"node-ca-6gbdc\" (UID: \"3b3a8d77-2840-4166-a03a-e49d2f4f7de6\") " pod="openshift-image-registry/node-ca-6gbdc"
Apr 22 19:23:32.583638 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.583612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc7h9\" (UniqueName: \"kubernetes.io/projected/0fd5cbd4-05ed-40a7-b22b-598f8e90a635-kube-api-access-jc7h9\") pod \"tuned-8cg7f\" (UID: \"0fd5cbd4-05ed-40a7-b22b-598f8e90a635\") " pod="openshift-cluster-node-tuning-operator/tuned-8cg7f"
Apr 22 19:23:32.585338 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.585321 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:32.585398 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.585343 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:32.585398 ip-10-0-131-132 kubenswrapper[2570]: E0422
19:23:32.585356 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:32.585463 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:32.585429 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.085410222 +0000 UTC m=+2.135773691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:32.587282 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.587258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9gp\" (UniqueName: \"kubernetes.io/projected/55c68bde-7e46-4a89-a5ae-8a4047fde6e7-kube-api-access-8g9gp\") pod \"multus-additional-cni-plugins-kxfcc\" (UID: \"55c68bde-7e46-4a89-a5ae-8a4047fde6e7\") " pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.588743 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.588721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw85s\" (UniqueName: \"kubernetes.io/projected/2b85cffd-a88d-4c1f-bca8-1aa201cd64b3-kube-api-access-zw85s\") pod \"aws-ebs-csi-driver-node-f5w5l\" (UID: \"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.589322 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.589307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmr2x\" (UniqueName: \"kubernetes.io/projected/9b64e2bc-78bb-4188-ae32-9f0e5f92f75a-kube-api-access-cmr2x\") pod \"multus-qnx9f\" (UID: \"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a\") " pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.594689 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.594638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2d99\" (UniqueName: \"kubernetes.io/projected/5df89727-eca2-4929-8ab4-9c1a7832889b-kube-api-access-p2d99\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:32.594824 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.594807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc7kr\" (UniqueName: \"kubernetes.io/projected/2bee9e68-7e05-489d-adaf-1e469041f7c1-kube-api-access-fc7kr\") pod \"ovnkube-node-cszr4\" (UID: \"2bee9e68-7e05-489d-adaf-1e469041f7c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.597649 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.597628 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvjc\" (UniqueName: \"kubernetes.io/projected/0f0df841-c168-48f6-9e2e-f209a8216c52-kube-api-access-kxvjc\") pod \"iptables-alerter-mgsvn\" (UID: \"0f0df841-c168-48f6-9e2e-f209a8216c52\") " pod="openshift-network-operator/iptables-alerter-mgsvn" Apr 22 19:23:32.602875 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.602860 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" Apr 22 19:23:32.610256 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.610239 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6gbdc" Apr 22 19:23:32.622714 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.622697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" Apr 22 19:23:32.676370 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.676345 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode33fdcedf15ae63b7d7592c581d6241e.slice/crio-22578c87832c28e2d81286591fc4502e7aa2aefda7d895674328480baaa940de WatchSource:0}: Error finding container 22578c87832c28e2d81286591fc4502e7aa2aefda7d895674328480baaa940de: Status 404 returned error can't find the container with id 22578c87832c28e2d81286591fc4502e7aa2aefda7d895674328480baaa940de Apr 22 19:23:32.676860 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.676845 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda12817d60fd512a01cb9024b96513bd3.slice/crio-da0087081f0a118b8a3e83a2aba70243f067c0a239645ba9688203035c2d288b WatchSource:0}: Error finding container da0087081f0a118b8a3e83a2aba70243f067c0a239645ba9688203035c2d288b: Status 404 returned error can't find the container with id da0087081f0a118b8a3e83a2aba70243f067c0a239645ba9688203035c2d288b Apr 22 19:23:32.681027 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.681011 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:32.776430 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.776404 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-gkcvh" Apr 22 19:23:32.782119 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.782093 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a9ae74_55c2_42a6_b0ec_5e69fea1ba3d.slice/crio-f817770e9d4c143fc0c77dde21a7d4223126def945e8e5fad83b7d3a0192a4f6 WatchSource:0}: Error finding container f817770e9d4c143fc0c77dde21a7d4223126def945e8e5fad83b7d3a0192a4f6: Status 404 returned error can't find the container with id f817770e9d4c143fc0c77dde21a7d4223126def945e8e5fad83b7d3a0192a4f6 Apr 22 19:23:32.796559 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.796537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" Apr 22 19:23:32.802198 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.802176 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8w5lr" Apr 22 19:23:32.803527 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.803500 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b85cffd_a88d_4c1f_bca8_1aa201cd64b3.slice/crio-63b557b48e6b8a7ff18803cdf125b0c014449b98ef4e3933b5f906e17a0dca43 WatchSource:0}: Error finding container 63b557b48e6b8a7ff18803cdf125b0c014449b98ef4e3933b5f906e17a0dca43: Status 404 returned error can't find the container with id 63b557b48e6b8a7ff18803cdf125b0c014449b98ef4e3933b5f906e17a0dca43 Apr 22 19:23:32.808950 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.808925 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d618274_e61e_4ac5_b98d_0316d3addc15.slice/crio-ffebcd53c646300bca23f4d2921572ca3a8d37133f64ae6147a0369f3ded6b91 WatchSource:0}: Error finding container 
ffebcd53c646300bca23f4d2921572ca3a8d37133f64ae6147a0369f3ded6b91: Status 404 returned error can't find the container with id ffebcd53c646300bca23f4d2921572ca3a8d37133f64ae6147a0369f3ded6b91 Apr 22 19:23:32.840916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.840886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qnx9f" Apr 22 19:23:32.847270 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.847242 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b64e2bc_78bb_4188_ae32_9f0e5f92f75a.slice/crio-19a3b635c086d1b8ba400d6f2741bbe2b26a6882c77e1da88c05bd67760e564a WatchSource:0}: Error finding container 19a3b635c086d1b8ba400d6f2741bbe2b26a6882c77e1da88c05bd67760e564a: Status 404 returned error can't find the container with id 19a3b635c086d1b8ba400d6f2741bbe2b26a6882c77e1da88c05bd67760e564a Apr 22 19:23:32.854304 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.854267 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mgsvn" Apr 22 19:23:32.860175 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.860152 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f0df841_c168_48f6_9e2e_f209a8216c52.slice/crio-cf186373c22bc0b9f807f9e7e30e82341d5327f20d6c6eca292c65bbca99b041 WatchSource:0}: Error finding container cf186373c22bc0b9f807f9e7e30e82341d5327f20d6c6eca292c65bbca99b041: Status 404 returned error can't find the container with id cf186373c22bc0b9f807f9e7e30e82341d5327f20d6c6eca292c65bbca99b041 Apr 22 19:23:32.877361 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:32.877336 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:23:32.883493 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.883470 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bee9e68_7e05_489d_adaf_1e469041f7c1.slice/crio-087607ae1d2413a66f9c8b600b6261a30ad3abb25969e95dd644bbc5d3102c59 WatchSource:0}: Error finding container 087607ae1d2413a66f9c8b600b6261a30ad3abb25969e95dd644bbc5d3102c59: Status 404 returned error can't find the container with id 087607ae1d2413a66f9c8b600b6261a30ad3abb25969e95dd644bbc5d3102c59 Apr 22 19:23:32.931700 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.931671 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c68bde_7e46_4a89_a5ae_8a4047fde6e7.slice/crio-d91198700d5d7c71308aac8d6d01a4b66177f56eed3fb3ab7106be808cfd8c7f WatchSource:0}: Error finding container d91198700d5d7c71308aac8d6d01a4b66177f56eed3fb3ab7106be808cfd8c7f: Status 404 returned error can't find the container with id d91198700d5d7c71308aac8d6d01a4b66177f56eed3fb3ab7106be808cfd8c7f Apr 22 19:23:32.965406 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.965378 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd5cbd4_05ed_40a7_b22b_598f8e90a635.slice/crio-44db70ce5f3121a5452512f21338e666f9668565549a400aeb55b5e0d57b5e71 WatchSource:0}: Error finding container 44db70ce5f3121a5452512f21338e666f9668565549a400aeb55b5e0d57b5e71: Status 404 returned error can't find the container with id 44db70ce5f3121a5452512f21338e666f9668565549a400aeb55b5e0d57b5e71 Apr 22 19:23:32.966046 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:23:32.966018 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3a8d77_2840_4166_a03a_e49d2f4f7de6.slice/crio-7bd614420486681f2452574dc64c186d0d18ba264e34c211faeea7278836a08e WatchSource:0}: Error finding container 7bd614420486681f2452574dc64c186d0d18ba264e34c211faeea7278836a08e: Status 404 returned error can't find the container with id 7bd614420486681f2452574dc64c186d0d18ba264e34c211faeea7278836a08e Apr 22 19:23:33.069978 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.069879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:33.070200 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:33.070038 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:33.070200 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:33.070111 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.070091844 +0000 UTC m=+3.120455328 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:33.170957 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.170868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:33.171116 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:33.171070 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:33.171116 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:33.171094 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:33.171116 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:33.171108 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:33.171282 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:33.171188 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:34.171168118 +0000 UTC m=+3.221531590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:33.514758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.514668 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:32 +0000 UTC" deadline="2027-10-26 13:44:40.236773533 +0000 UTC" Apr 22 19:23:33.514758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.514755 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13242h21m6.722022411s" Apr 22 19:23:33.625195 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.625125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6gbdc" event={"ID":"3b3a8d77-2840-4166-a03a-e49d2f4f7de6","Type":"ContainerStarted","Data":"7bd614420486681f2452574dc64c186d0d18ba264e34c211faeea7278836a08e"} Apr 22 19:23:33.648590 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.648532 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" event={"ID":"0fd5cbd4-05ed-40a7-b22b-598f8e90a635","Type":"ContainerStarted","Data":"44db70ce5f3121a5452512f21338e666f9668565549a400aeb55b5e0d57b5e71"} Apr 22 19:23:33.661368 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.661145 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:33.670869 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.670834 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"087607ae1d2413a66f9c8b600b6261a30ad3abb25969e95dd644bbc5d3102c59"} Apr 22 19:23:33.694501 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.694463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mgsvn" event={"ID":"0f0df841-c168-48f6-9e2e-f209a8216c52","Type":"ContainerStarted","Data":"cf186373c22bc0b9f807f9e7e30e82341d5327f20d6c6eca292c65bbca99b041"} Apr 22 19:23:33.707530 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.707427 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnx9f" event={"ID":"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a","Type":"ContainerStarted","Data":"19a3b635c086d1b8ba400d6f2741bbe2b26a6882c77e1da88c05bd67760e564a"} Apr 22 19:23:33.722502 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.722472 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:33.748929 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.748876 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" event={"ID":"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3","Type":"ContainerStarted","Data":"63b557b48e6b8a7ff18803cdf125b0c014449b98ef4e3933b5f906e17a0dca43"} Apr 22 19:23:33.778440 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.778348 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gkcvh" event={"ID":"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d","Type":"ContainerStarted","Data":"f817770e9d4c143fc0c77dde21a7d4223126def945e8e5fad83b7d3a0192a4f6"} Apr 22 19:23:33.810315 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.809951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal" event={"ID":"e33fdcedf15ae63b7d7592c581d6241e","Type":"ContainerStarted","Data":"22578c87832c28e2d81286591fc4502e7aa2aefda7d895674328480baaa940de"} Apr 22 19:23:33.820062 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.820020 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerStarted","Data":"d91198700d5d7c71308aac8d6d01a4b66177f56eed3fb3ab7106be808cfd8c7f"} Apr 22 19:23:33.830853 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.830819 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8w5lr" event={"ID":"9d618274-e61e-4ac5-b98d-0316d3addc15","Type":"ContainerStarted","Data":"ffebcd53c646300bca23f4d2921572ca3a8d37133f64ae6147a0369f3ded6b91"} Apr 22 19:23:33.853653 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.853616 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal" event={"ID":"a12817d60fd512a01cb9024b96513bd3","Type":"ContainerStarted","Data":"da0087081f0a118b8a3e83a2aba70243f067c0a239645ba9688203035c2d288b"} Apr 22 19:23:33.880920 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:33.880610 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:34.077775 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:34.077691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:34.077941 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.077844 2570 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:34.077941 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.077907 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.077887809 +0000 UTC m=+5.128251283 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:34.178590 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:34.178543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:34.178753 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.178711 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:34.178753 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.178733 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:34.178753 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.178747 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:34.178989 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.178804 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.178783431 +0000 UTC m=+5.229146902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:34.515698 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:34.515656 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:32 +0000 UTC" deadline="2027-12-08 11:35:57.255084451 +0000 UTC" Apr 22 19:23:34.515698 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:34.515697 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14272h12m22.739392047s" Apr 22 19:23:34.591746 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:34.590981 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:34.591746 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.591110 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:34.591746 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:34.591581 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:34.591746 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:34.591693 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:36.095981 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:36.095672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:36.102796 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.096778 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:36.102796 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.096902 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:40.09684587 +0000 UTC m=+9.147209341 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:36.198460 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:36.198422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:36.198711 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.198637 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:36.198711 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.198665 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:36.198711 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.198680 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:36.198960 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.198743 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:40.198721973 +0000 UTC m=+9.249085458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:36.591601 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:36.591558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:36.591784 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.591684 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:36.592034 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:36.592017 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:36.592118 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:36.592102 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:38.591739 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:38.591036 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:38.591739 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:38.591167 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:38.591739 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:38.591593 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:38.591739 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:38.591698 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:40.128323 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:40.128281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:40.128815 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.128373 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:40.128815 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.128446 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.128426033 +0000 UTC m=+17.178789508 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:40.229693 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:40.229643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:40.229866 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.229827 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:40.229866 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.229854 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:40.229947 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.229871 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:40.229978 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.229945 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:48.229923558 +0000 UTC m=+17.280287039 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:40.591272 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:40.591235 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:40.591462 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.591373 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:40.591670 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:40.591651 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:40.591793 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:40.591763 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:42.591273 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:42.591237 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:42.591781 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:42.591252 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:42.591781 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:42.591391 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:42.591781 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:42.591498 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:44.590770 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:44.590733 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:44.591199 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:44.590734 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:44.591199 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:44.590881 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:44.591199 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:44.590976 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:46.591128 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:46.591053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:46.591503 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:46.591053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:46.591503 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:46.591149 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:46.591503 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:46.591271 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:48.186228 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:48.186198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:48.186719 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.186325 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.186719 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.186390 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.186372909 +0000 UTC m=+33.236736378 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.287399 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:48.287367 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:48.287614 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.287547 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:48.287614 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.287586 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:48.287614 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.287599 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.287739 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.287646 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:04.28763192 +0000 UTC m=+33.337995392 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.590894 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:48.590864 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:48.591087 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:48.590864 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:48.591087 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.591004 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:48.591194 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:48.591091 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:50.590850 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:50.590809 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:50.590850 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:50.590851 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:50.591296 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:50.590926 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:50.591296 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:50.591006 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:51.895037 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.894671 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" event={"ID":"0fd5cbd4-05ed-40a7-b22b-598f8e90a635","Type":"ContainerStarted","Data":"6d2f79eddf15256a392158139ff3c53881341ffa2676d84f8b97a5067abc751c"} Apr 22 19:23:51.897426 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897404 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:23:51.897800 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897770 2570 generic.go:358] "Generic (PLEG): container finished" podID="2bee9e68-7e05-489d-adaf-1e469041f7c1" containerID="18d721571bfd4e6c3e7526c922402ecbf0ad3dc31b41c750e819c7cee2b1d00b" exitCode=1 Apr 22 19:23:51.897903 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897888 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"1c9a7b99e2e76233881f2e181b34e9f3e70c0258e3cfdc1e031675bb61428106"} Apr 22 19:23:51.897956 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"06db20592e15d0a9dfd336af60a554278e11fdda92dcdde9a98a407c57a0efce"} Apr 22 19:23:51.897956 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897932 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"84167da30a731e7de284fb8d1d965ffa2b656f3187f5d0675bb1ed7d20e6cfc2"} Apr 22 
19:23:51.897956 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897945 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"838b5126e242677c82cc1b211bc0f2f17534dd524a68d0ebc64453b90af837b5"} Apr 22 19:23:51.898110 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897958 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerDied","Data":"18d721571bfd4e6c3e7526c922402ecbf0ad3dc31b41c750e819c7cee2b1d00b"} Apr 22 19:23:51.898110 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.897973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"4262c9280ddaf56cc437cec620ccda42b478ea5375fa5af1d2a34d4a44d7b833"} Apr 22 19:23:51.899973 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.899881 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnx9f" event={"ID":"9b64e2bc-78bb-4188-ae32-9f0e5f92f75a","Type":"ContainerStarted","Data":"a436871a0f20e079599378e11c4178a6cab3c38fa1749758c43ce8734c39cec9"} Apr 22 19:23:51.901552 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.901533 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal" event={"ID":"e33fdcedf15ae63b7d7592c581d6241e","Type":"ContainerStarted","Data":"6077b657a3f3747a5bbe667d2c391884c6291a0695ad0b215b31d0e8b9357d13"} Apr 22 19:23:51.913647 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.913544 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8cg7f" podStartSLOduration=2.791645203 podStartE2EDuration="20.913528589s" podCreationTimestamp="2026-04-22 
19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.967117935 +0000 UTC m=+2.017481403" lastFinishedPulling="2026-04-22 19:23:51.089001319 +0000 UTC m=+20.139364789" observedRunningTime="2026-04-22 19:23:51.912816713 +0000 UTC m=+20.963180203" watchObservedRunningTime="2026-04-22 19:23:51.913528589 +0000 UTC m=+20.963892081" Apr 22 19:23:51.931909 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.931855 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qnx9f" podStartSLOduration=2.672711933 podStartE2EDuration="20.931836381s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.84884959 +0000 UTC m=+1.899213073" lastFinishedPulling="2026-04-22 19:23:51.10797404 +0000 UTC m=+20.158337521" observedRunningTime="2026-04-22 19:23:51.931169471 +0000 UTC m=+20.981533009" watchObservedRunningTime="2026-04-22 19:23:51.931836381 +0000 UTC m=+20.982199872" Apr 22 19:23:51.945333 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:51.945287 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-132.ec2.internal" podStartSLOduration=20.945268846 podStartE2EDuration="20.945268846s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:51.944763201 +0000 UTC m=+20.995126691" watchObservedRunningTime="2026-04-22 19:23:51.945268846 +0000 UTC m=+20.995632336" Apr 22 19:23:52.590655 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.590616 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:23:52.590832 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.590622 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:23:52.590832 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:52.590744 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:23:52.590832 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:52.590814 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8" Apr 22 19:23:52.904367 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.904332 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6gbdc" event={"ID":"3b3a8d77-2840-4166-a03a-e49d2f4f7de6","Type":"ContainerStarted","Data":"1d9157a9ec92d4f33ed4f877dd965808ceeca1958c8ea95d345919c555fa9217"} Apr 22 19:23:52.905654 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.905626 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mgsvn" event={"ID":"0f0df841-c168-48f6-9e2e-f209a8216c52","Type":"ContainerStarted","Data":"157cd1fc7b087d2ba379251a108ce80ed24019bbc7758aec1b9b79d49cfcd252"} Apr 22 19:23:52.906700 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.906678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" 
event={"ID":"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3","Type":"ContainerStarted","Data":"277b84e350e235ddc37d5edfb94688d461e58b0451ea06127d600cec36111abc"} Apr 22 19:23:52.907809 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.907778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gkcvh" event={"ID":"59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d","Type":"ContainerStarted","Data":"8d7498ad95b40779bfa770844cf91ee881a52bbd7f1162eaf258a59045605d84"} Apr 22 19:23:52.909083 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.909062 2570 generic.go:358] "Generic (PLEG): container finished" podID="55c68bde-7e46-4a89-a5ae-8a4047fde6e7" containerID="41ddf312ccc78c3e72ae3aaa3b739a404916cec4dc019e6524ebd82b2e80943d" exitCode=0 Apr 22 19:23:52.909156 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.909117 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerDied","Data":"41ddf312ccc78c3e72ae3aaa3b739a404916cec4dc019e6524ebd82b2e80943d"} Apr 22 19:23:52.910416 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.910351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8w5lr" event={"ID":"9d618274-e61e-4ac5-b98d-0316d3addc15","Type":"ContainerStarted","Data":"35adf661c61d30db3c86eb8700c45434ab20229f1a8b488388b5ac586484e404"} Apr 22 19:23:52.911848 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.911824 2570 generic.go:358] "Generic (PLEG): container finished" podID="a12817d60fd512a01cb9024b96513bd3" containerID="32dff52fa47974ce118d63a28c24083742fa3343d7db87ee052f189a1b6c1966" exitCode=0 Apr 22 19:23:52.911939 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.911917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal" 
event={"ID":"a12817d60fd512a01cb9024b96513bd3","Type":"ContainerDied","Data":"32dff52fa47974ce118d63a28c24083742fa3343d7db87ee052f189a1b6c1966"} Apr 22 19:23:52.919645 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.919608 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6gbdc" podStartSLOduration=3.8136969560000002 podStartE2EDuration="21.919596165s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.967613492 +0000 UTC m=+2.017976960" lastFinishedPulling="2026-04-22 19:23:51.073512695 +0000 UTC m=+20.123876169" observedRunningTime="2026-04-22 19:23:52.919317871 +0000 UTC m=+21.969681360" watchObservedRunningTime="2026-04-22 19:23:52.919596165 +0000 UTC m=+21.969959991" Apr 22 19:23:52.972183 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.972136 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gkcvh" podStartSLOduration=3.682325268 podStartE2EDuration="21.97211007s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.783760251 +0000 UTC m=+1.834123719" lastFinishedPulling="2026-04-22 19:23:51.07354505 +0000 UTC m=+20.123908521" observedRunningTime="2026-04-22 19:23:52.972070125 +0000 UTC m=+22.022433616" watchObservedRunningTime="2026-04-22 19:23:52.97211007 +0000 UTC m=+22.022473558" Apr 22 19:23:52.989343 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:52.989169 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8w5lr" podStartSLOduration=3.747915867 podStartE2EDuration="21.989147168s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.810403954 +0000 UTC m=+1.860767422" lastFinishedPulling="2026-04-22 19:23:51.05163525 +0000 UTC m=+20.101998723" observedRunningTime="2026-04-22 19:23:52.987887053 +0000 UTC m=+22.038250543" 
watchObservedRunningTime="2026-04-22 19:23:52.989147168 +0000 UTC m=+22.039510661" Apr 22 19:23:53.002601 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.002264 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mgsvn" podStartSLOduration=3.771229037 podStartE2EDuration="22.002244737s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.861482991 +0000 UTC m=+1.911846460" lastFinishedPulling="2026-04-22 19:23:51.092498678 +0000 UTC m=+20.142862160" observedRunningTime="2026-04-22 19:23:53.001870084 +0000 UTC m=+22.052233574" watchObservedRunningTime="2026-04-22 19:23:53.002244737 +0000 UTC m=+22.052608228" Apr 22 19:23:53.083857 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.083818 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:53.341541 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.341504 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gkcvh" Apr 22 19:23:53.342325 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.342302 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gkcvh" Apr 22 19:23:53.538069 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.537938 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:53.083839141Z","UUID":"d012f5c7-48be-4072-8698-90a8418278fd","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:53.541205 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.541174 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 
Apr 22 19:23:53.541205 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.541208 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:23:53.917004 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.916360 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" event={"ID":"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3","Type":"ContainerStarted","Data":"d034135af3243c004027f2a3c64f288da9277e8ca388ff6a64c041b2c423ea4a"}
Apr 22 19:23:53.919605 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.918942 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal" event={"ID":"a12817d60fd512a01cb9024b96513bd3","Type":"ContainerStarted","Data":"53c125a4600dbaa213a4f1271efcdc81c385edd91a93e0b5ce3feac8e5002477"}
Apr 22 19:23:53.919605 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.919548 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:53.920163 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.920136 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gkcvh"
Apr 22 19:23:53.934871 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:53.934817 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-132.ec2.internal" podStartSLOduration=22.934794838 podStartE2EDuration="22.934794838s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:53.933476213 +0000 UTC m=+22.983839705" watchObservedRunningTime="2026-04-22 19:23:53.934794838 +0000 UTC m=+22.985158329"
Apr 22 19:23:54.591388 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:54.591160 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:23:54.591581 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:54.591160 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:23:54.591581 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:54.591513 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b"
Apr 22 19:23:54.591581 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:54.591546 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8"
Apr 22 19:23:54.923895 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:54.923820 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log"
Apr 22 19:23:54.924335 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:54.924222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"1d00c81b71630b377c9f34f4e954e8632258e6d3cb5ec4c7a4b71c5796529e97"}
Apr 22 19:23:54.926675 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:54.926368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" event={"ID":"2b85cffd-a88d-4c1f-bca8-1aa201cd64b3","Type":"ContainerStarted","Data":"914ededf231a581ae8e41d51ba371139f9d5c2ef181bbfde66fb00621f58c41c"}
Apr 22 19:23:54.946774 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:54.946703 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5w5l" podStartSLOduration=2.868479905 podStartE2EDuration="23.946689291s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.80527304 +0000 UTC m=+1.855636508" lastFinishedPulling="2026-04-22 19:23:53.883482422 +0000 UTC m=+22.933845894" observedRunningTime="2026-04-22 19:23:54.946529898 +0000 UTC m=+23.996893421" watchObservedRunningTime="2026-04-22 19:23:54.946689291 +0000 UTC m=+23.997052812"
Apr 22 19:23:56.591391 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.591348 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:23:56.592049 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:56.591491 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8"
Apr 22 19:23:56.592049 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.591552 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:23:56.592049 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:56.591679 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b"
Apr 22 19:23:56.933841 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.933650 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log"
Apr 22 19:23:56.935097 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.935061 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"222ae1b207e5c52488e7e3e4ff100230f545c188cb307f7dbdeedded61eaa115"}
Apr 22 19:23:56.936171 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.935686 2570 scope.go:117] "RemoveContainer" containerID="18d721571bfd4e6c3e7526c922402ecbf0ad3dc31b41c750e819c7cee2b1d00b"
Apr 22 19:23:56.936171 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.936088 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:56.936354 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.936211 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:56.955734 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.955689 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:56.956854 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:56.956816 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:57.940824 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:57.940794 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log"
Apr 22 19:23:57.941519 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:57.941287 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" event={"ID":"2bee9e68-7e05-489d-adaf-1e469041f7c1","Type":"ContainerStarted","Data":"99f852da320ec458aeabbdcd8ddc0ae43160e302741c2573e50a9af2d7f4c3bb"}
Apr 22 19:23:57.941519 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:57.941406 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:23:57.943179 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:57.943154 2570 generic.go:358] "Generic (PLEG): container finished" podID="55c68bde-7e46-4a89-a5ae-8a4047fde6e7" containerID="fa898afadec8ad40b661dd692203330d6da3de919e433c5297443c0ef17ce310" exitCode=0
Apr 22 19:23:57.943287 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:57.943201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerDied","Data":"fa898afadec8ad40b661dd692203330d6da3de919e433c5297443c0ef17ce310"}
Apr 22 19:23:57.971251 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:57.971200 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" podStartSLOduration=8.464343121 podStartE2EDuration="26.971187014s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.884983953 +0000 UTC m=+1.935347420" lastFinishedPulling="2026-04-22 19:23:51.391827841 +0000 UTC m=+20.442191313" observedRunningTime="2026-04-22 19:23:57.969853906 +0000 UTC m=+27.020217395" watchObservedRunningTime="2026-04-22 19:23:57.971187014 +0000 UTC m=+27.021550504"
Apr 22 19:23:58.592558 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.591941 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:23:58.592558 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:58.592088 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b"
Apr 22 19:23:58.592558 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.591941 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:23:58.592558 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:58.592518 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8"
Apr 22 19:23:58.752797 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.752558 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4"
Apr 22 19:23:58.783812 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.783753 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mch7s"]
Apr 22 19:23:58.789736 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.789711 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gwm2k"]
Apr 22 19:23:58.946513 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.946479 2570 generic.go:358] "Generic (PLEG): container finished" podID="55c68bde-7e46-4a89-a5ae-8a4047fde6e7" containerID="7f196b3e43bf91f0d4de6ed5b1d7ef2d8d446cc4cace3c09e1cad350e962e369" exitCode=0
Apr 22 19:23:58.946930 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.946538 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerDied","Data":"7f196b3e43bf91f0d4de6ed5b1d7ef2d8d446cc4cace3c09e1cad350e962e369"}
Apr 22 19:23:58.946930 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.946728 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:23:58.946930 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:58.946822 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8"
Apr 22 19:23:58.946930 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:58.946884 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:23:58.947136 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:23:58.946982 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b"
Apr 22 19:23:59.950182 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:59.950144 2570 generic.go:358] "Generic (PLEG): container finished" podID="55c68bde-7e46-4a89-a5ae-8a4047fde6e7" containerID="9a13277eeb8990d0e30bf8ac26d58e4f18fb6ed29fe70f8a6c6ecd7eb1dca509" exitCode=0
Apr 22 19:23:59.950758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:23:59.950224 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerDied","Data":"9a13277eeb8990d0e30bf8ac26d58e4f18fb6ed29fe70f8a6c6ecd7eb1dca509"}
Apr 22 19:24:00.591562 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:00.591520 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:24:00.591770 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:00.591667 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8"
Apr 22 19:24:00.591770 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:00.591726 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:24:00.591881 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:00.591830 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b"
Apr 22 19:24:02.591153 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:02.591123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:24:02.591917 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:02.591123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:24:02.591917 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:02.591252 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b"
Apr 22 19:24:02.591917 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:02.591371 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mch7s" podUID="d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8"
Apr 22 19:24:04.206425 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.206373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:24:04.206921 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.206506 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:24:04.206921 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.206596 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.20655799 +0000 UTC m=+65.256921465 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:24:04.255466 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.255416 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-132.ec2.internal" event="NodeReady"
Apr 22 19:24:04.255666 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.255592 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 19:24:04.306940 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.306907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:24:04.307117 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.307081 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:24:04.307117 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.307108 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:24:04.307236 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.307120 2570 projected.go:194] Error preparing data for projected volume kube-api-access-84thh for pod openshift-network-diagnostics/network-check-target-mch7s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:24:04.307236 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.307192 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh podName:d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.307172204 +0000 UTC m=+65.357535687 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-84thh" (UniqueName: "kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh") pod "network-check-target-mch7s" (UID: "d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:24:04.357568 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.357529 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-95lfv"]
Apr 22 19:24:04.390377 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.390343 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8vz9d"]
Apr 22 19:24:04.390598 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.390558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.397337 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.397315 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8zlb\""
Apr 22 19:24:04.397769 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.397744 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:24:04.397873 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.397752 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:24:04.407218 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.407192 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-95lfv"]
Apr 22 19:24:04.407218 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.407222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8vz9d"]
Apr 22 19:24:04.407377 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.407357 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:04.412741 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.412694 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:24:04.412741 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.412721 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:24:04.412974 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.412856 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8fgp\""
Apr 22 19:24:04.412974 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.412903 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:24:04.508033 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.507948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:04.508192 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.508044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.508192 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.508141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/638b7915-fc24-4304-b691-5e2dd5b5a7ce-config-volume\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.508300 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.508216 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/638b7915-fc24-4304-b691-5e2dd5b5a7ce-tmp-dir\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.508300 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.508252 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwp5f\" (UniqueName: \"kubernetes.io/projected/638b7915-fc24-4304-b691-5e2dd5b5a7ce-kube-api-access-qwp5f\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.508300 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.508283 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5552d\" (UniqueName: \"kubernetes.io/projected/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-kube-api-access-5552d\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:04.591492 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.591453 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s"
Apr 22 19:24:04.591690 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.591457 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k"
Apr 22 19:24:04.595423 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.595364 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:04.595585 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.595469 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l6zl6\""
Apr 22 19:24:04.595585 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.595496 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:04.595585 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.595544 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kmhxv\""
Apr 22 19:24:04.595585 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.595551 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:04.609157 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/638b7915-fc24-4304-b691-5e2dd5b5a7ce-tmp-dir\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.609283 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwp5f\" (UniqueName: \"kubernetes.io/projected/638b7915-fc24-4304-b691-5e2dd5b5a7ce-kube-api-access-qwp5f\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.609283 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5552d\" (UniqueName: \"kubernetes.io/projected/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-kube-api-access-5552d\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:04.609283 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609270 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:04.609421 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.609421 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.609387 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.609421 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.609410 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:04.609518 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609427 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/638b7915-fc24-4304-b691-5e2dd5b5a7ce-tmp-dir\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.609518 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.609448 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.109429566 +0000 UTC m=+34.159793049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.609518 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:04.609466 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.109457066 +0000 UTC m=+34.159820536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found
Apr 22 19:24:04.609715 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/638b7915-fc24-4304-b691-5e2dd5b5a7ce-config-volume\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.610009 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.609985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/638b7915-fc24-4304-b691-5e2dd5b5a7ce-config-volume\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.625115 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.625087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwp5f\" (UniqueName: \"kubernetes.io/projected/638b7915-fc24-4304-b691-5e2dd5b5a7ce-kube-api-access-qwp5f\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:04.625247 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:04.625163 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5552d\" (UniqueName: \"kubernetes.io/projected/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-kube-api-access-5552d\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:05.112178 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:05.112145 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d"
Apr 22 19:24:05.112178 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:05.112185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv"
Apr 22 19:24:05.112406 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:05.112319 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:05.112406 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:05.112327 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:05.112406 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:05.112377 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:06.112359082 +0000 UTC m=+35.162722557 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:05.112406 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:05.112396 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:06.1123871 +0000 UTC m=+35.162750572 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:24:06.118938 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:06.118895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:24:06.118938 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:06.118943 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:24:06.119735 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:06.119045 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:06.119735 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:06.119072 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:06.119735 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:06.119113 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:08.119096072 +0000 UTC m=+37.169459541 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found Apr 22 19:24:06.119735 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:06.119129 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:08.119121832 +0000 UTC m=+37.169485299 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:24:06.967980 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:06.967950 2570 generic.go:358] "Generic (PLEG): container finished" podID="55c68bde-7e46-4a89-a5ae-8a4047fde6e7" containerID="28e2d40b4dc8deec8658dfc334d1e1dee20a0cbcee44019882828aca192f5cc3" exitCode=0 Apr 22 19:24:06.968140 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:06.967992 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerDied","Data":"28e2d40b4dc8deec8658dfc334d1e1dee20a0cbcee44019882828aca192f5cc3"} Apr 22 19:24:07.972967 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:07.972781 2570 generic.go:358] "Generic (PLEG): container finished" podID="55c68bde-7e46-4a89-a5ae-8a4047fde6e7" containerID="76f611df036b75f9f53f2f2c61758d7277a7937b8914ef9baa1c6b7ec53fe804" exitCode=0 Apr 22 19:24:07.972967 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:07.972866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" 
event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerDied","Data":"76f611df036b75f9f53f2f2c61758d7277a7937b8914ef9baa1c6b7ec53fe804"} Apr 22 19:24:08.132526 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:08.132496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:24:08.132526 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:08.132532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:24:08.132771 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:08.132635 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:08.132771 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:08.132650 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:08.132771 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:08.132699 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:12.132679946 +0000 UTC m=+41.183043430 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found Apr 22 19:24:08.132771 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:08.132717 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:12.132708112 +0000 UTC m=+41.183071580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:24:08.978315 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:08.978285 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" event={"ID":"55c68bde-7e46-4a89-a5ae-8a4047fde6e7","Type":"ContainerStarted","Data":"754be0f404fa3e52e6703865a7790e50cc03a4a254bad7bce7d17bd9b8376dea"} Apr 22 19:24:12.160883 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:12.160848 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:24:12.160883 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:12.160885 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" 
Apr 22 19:24:12.161267 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:12.160991 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:12.161267 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:12.160993 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:12.161267 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:12.161044 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.16102926 +0000 UTC m=+49.211392728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found Apr 22 19:24:12.161267 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:12.161057 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.161051537 +0000 UTC m=+49.211415004 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:24:20.216249 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:20.216210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:24:20.216249 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:20.216245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:24:20.216805 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:20.216354 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:20.216805 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:20.216367 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:20.216805 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:20.216404 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.216389429 +0000 UTC m=+65.266752897 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found Apr 22 19:24:20.216805 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:20.216443 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.216425468 +0000 UTC m=+65.266788939 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:24:21.845805 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.845750 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kxfcc" podStartSLOduration=17.740336488 podStartE2EDuration="50.845732009s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:32.935529677 +0000 UTC m=+1.985893145" lastFinishedPulling="2026-04-22 19:24:06.040925183 +0000 UTC m=+35.091288666" observedRunningTime="2026-04-22 19:24:09.006171748 +0000 UTC m=+38.056535236" watchObservedRunningTime="2026-04-22 19:24:21.845732009 +0000 UTC m=+50.896095499" Apr 22 19:24:21.846252 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.845902 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5"] Apr 22 19:24:21.870121 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.870092 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4"] Apr 22 
19:24:21.870281 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.870161 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:21.873312 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.873287 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:24:21.873878 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.873860 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:24:21.874751 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.874729 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:24:21.874860 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.874780 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tft4l\"" Apr 22 19:24:21.874860 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.874809 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 19:24:21.889662 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.889636 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5"] Apr 22 19:24:21.889662 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.889660 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4"] Apr 22 19:24:21.889829 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.889752 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:21.892531 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.892506 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:24:21.892531 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.892518 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:24:21.892531 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.892533 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:24:21.892779 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:21.892657 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:24:22.030714 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.030888 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030782 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.030888 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030813 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.030888 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-ca\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.030991 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-hub\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.030991 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030916 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bdx\" (UniqueName: \"kubernetes.io/projected/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-kube-api-access-d8bdx\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 
19:24:22.030991 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/376f0a5e-a45c-4ed3-bd9b-ac09d1b56541-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59b9cb474c-d5ng5\" (UID: \"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.030991 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.030959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8dv\" (UniqueName: \"kubernetes.io/projected/376f0a5e-a45c-4ed3-bd9b-ac09d1b56541-kube-api-access-dr8dv\") pod \"managed-serviceaccount-addon-agent-59b9cb474c-d5ng5\" (UID: \"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.132310 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-hub\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.132310 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bdx\" (UniqueName: \"kubernetes.io/projected/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-kube-api-access-d8bdx\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.132310 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132281 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/376f0a5e-a45c-4ed3-bd9b-ac09d1b56541-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59b9cb474c-d5ng5\" (UID: \"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.132310 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132301 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8dv\" (UniqueName: \"kubernetes.io/projected/376f0a5e-a45c-4ed3-bd9b-ac09d1b56541-kube-api-access-dr8dv\") pod \"managed-serviceaccount-addon-agent-59b9cb474c-d5ng5\" (UID: \"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.132545 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.132545 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132495 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.132545 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" 
(UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.132545 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.132540 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-ca\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.133334 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.133309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.135829 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.135805 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-ca\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.135953 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.135871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.135953 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.135915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-hub\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.136205 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.136186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.136266 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.136200 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/376f0a5e-a45c-4ed3-bd9b-ac09d1b56541-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59b9cb474c-d5ng5\" (UID: \"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.147816 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.147788 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8dv\" (UniqueName: \"kubernetes.io/projected/376f0a5e-a45c-4ed3-bd9b-ac09d1b56541-kube-api-access-dr8dv\") pod \"managed-serviceaccount-addon-agent-59b9cb474c-d5ng5\" (UID: \"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.149589 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.149544 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bdx\" (UniqueName: \"kubernetes.io/projected/adb06a7e-4d1f-4568-83a5-1faeb9a98fab-kube-api-access-d8bdx\") pod \"cluster-proxy-proxy-agent-888454476-wklt4\" (UID: \"adb06a7e-4d1f-4568-83a5-1faeb9a98fab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.189472 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.189436 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" Apr 22 19:24:22.198240 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.198206 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:24:22.351045 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.351016 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5"] Apr 22 19:24:22.354315 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:24:22.354286 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod376f0a5e_a45c_4ed3_bd9b_ac09d1b56541.slice/crio-213a1613e3cd07d20b7da8bca3a7353b33901c7eaff11089e126d6c1ec9ede87 WatchSource:0}: Error finding container 213a1613e3cd07d20b7da8bca3a7353b33901c7eaff11089e126d6c1ec9ede87: Status 404 returned error can't find the container with id 213a1613e3cd07d20b7da8bca3a7353b33901c7eaff11089e126d6c1ec9ede87 Apr 22 19:24:22.365287 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:22.365264 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4"] Apr 22 19:24:22.370069 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:24:22.370039 2570 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb06a7e_4d1f_4568_83a5_1faeb9a98fab.slice/crio-3f02e27255ed8013f07271135c9ada21f06f0850b99fa59b0cd842c15ada7eb5 WatchSource:0}: Error finding container 3f02e27255ed8013f07271135c9ada21f06f0850b99fa59b0cd842c15ada7eb5: Status 404 returned error can't find the container with id 3f02e27255ed8013f07271135c9ada21f06f0850b99fa59b0cd842c15ada7eb5 Apr 22 19:24:23.006453 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:23.006417 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" event={"ID":"adb06a7e-4d1f-4568-83a5-1faeb9a98fab","Type":"ContainerStarted","Data":"3f02e27255ed8013f07271135c9ada21f06f0850b99fa59b0cd842c15ada7eb5"} Apr 22 19:24:23.007610 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:23.007566 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" event={"ID":"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541","Type":"ContainerStarted","Data":"213a1613e3cd07d20b7da8bca3a7353b33901c7eaff11089e126d6c1ec9ede87"} Apr 22 19:24:26.014780 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:26.014743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" event={"ID":"adb06a7e-4d1f-4568-83a5-1faeb9a98fab","Type":"ContainerStarted","Data":"53f34ce3baa6525844f24f8de91d457d3a20821495406128fe5712c87838b099"} Apr 22 19:24:26.015898 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:26.015874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" event={"ID":"376f0a5e-a45c-4ed3-bd9b-ac09d1b56541","Type":"ContainerStarted","Data":"1e9d93b9c6c999555a5c0f91e23d141774eddda61760a1166720fc001e05d860"} Apr 22 19:24:26.031878 
ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:26.031830 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59b9cb474c-d5ng5" podStartSLOduration=1.8332156419999999 podStartE2EDuration="5.031818213s" podCreationTimestamp="2026-04-22 19:24:21 +0000 UTC" firstStartedPulling="2026-04-22 19:24:22.356114096 +0000 UTC m=+51.406477565" lastFinishedPulling="2026-04-22 19:24:25.554716664 +0000 UTC m=+54.605080136" observedRunningTime="2026-04-22 19:24:26.031609162 +0000 UTC m=+55.081972652" watchObservedRunningTime="2026-04-22 19:24:26.031818213 +0000 UTC m=+55.082181702" Apr 22 19:24:29.021900 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:29.021866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" event={"ID":"adb06a7e-4d1f-4568-83a5-1faeb9a98fab","Type":"ContainerStarted","Data":"8f98443998642f5375338b777f269c722c5464e3904e80d4d3b1913e2e736278"} Apr 22 19:24:29.021900 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:29.021900 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" event={"ID":"adb06a7e-4d1f-4568-83a5-1faeb9a98fab","Type":"ContainerStarted","Data":"03cf2c7035b38bb98618b6f7e25e7dc4d446bd2b7c47518a361ecd169d069b64"} Apr 22 19:24:29.046397 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:29.044627 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" podStartSLOduration=2.245963338 podStartE2EDuration="8.044607451s" podCreationTimestamp="2026-04-22 19:24:21 +0000 UTC" firstStartedPulling="2026-04-22 19:24:22.371978774 +0000 UTC m=+51.422342242" lastFinishedPulling="2026-04-22 19:24:28.170622873 +0000 UTC m=+57.220986355" observedRunningTime="2026-04-22 19:24:29.043474637 +0000 UTC 
m=+58.093838129" watchObservedRunningTime="2026-04-22 19:24:29.044607451 +0000 UTC m=+58.094970942" Apr 22 19:24:29.960504 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:29.960478 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cszr4" Apr 22 19:24:36.241089 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.241038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:24:36.241089 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.241092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:24:36.241625 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.241215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:24:36.241625 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:36.241303 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:36.241625 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:36.241351 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:36.241625 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:36.241356 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:25:08.241340156 +0000 UTC m=+97.291703625 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:24:36.241625 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:36.241406 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. No retries permitted until 2026-04-22 19:25:08.241391249 +0000 UTC m=+97.291754723 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found Apr 22 19:24:36.243841 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.243826 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:36.252291 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:36.252268 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:24:36.252426 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:24:36.252349 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.252328816 +0000 UTC m=+129.302692289 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : secret "metrics-daemon-secret" not found Apr 22 19:24:36.341956 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.341925 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:24:36.345194 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.345174 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:24:36.355242 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.355219 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:24:36.365779 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.365756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84thh\" (UniqueName: \"kubernetes.io/projected/d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8-kube-api-access-84thh\") pod \"network-check-target-mch7s\" (UID: \"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8\") " pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:24:36.405062 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.405029 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l6zl6\"" Apr 22 19:24:36.413225 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.413200 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:24:36.538479 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:36.538446 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mch7s"] Apr 22 19:24:36.542254 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:24:36.542225 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5823fb1_e8f7_45fb_911a_f3cbcc56dfc8.slice/crio-6561ba06daf182742a2f2adf1f1a7dbe9474de3f3497df057a590ea7ee9b69e0 WatchSource:0}: Error finding container 6561ba06daf182742a2f2adf1f1a7dbe9474de3f3497df057a590ea7ee9b69e0: Status 404 returned error can't find the container with id 6561ba06daf182742a2f2adf1f1a7dbe9474de3f3497df057a590ea7ee9b69e0 Apr 22 19:24:37.040765 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:37.040731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mch7s" event={"ID":"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8","Type":"ContainerStarted","Data":"6561ba06daf182742a2f2adf1f1a7dbe9474de3f3497df057a590ea7ee9b69e0"} Apr 22 19:24:40.048947 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:40.048910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mch7s" event={"ID":"d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8","Type":"ContainerStarted","Data":"114c1a26b40e50924dafe6ad4ee6e094cd94fab1075d4c5d83fb7e4a88a82fe0"} Apr 22 19:24:40.049380 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:40.049064 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:24:40.065790 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:24:40.065745 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mch7s" 
podStartSLOduration=66.206875104 podStartE2EDuration="1m9.065731662s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:24:36.54403741 +0000 UTC m=+65.594400879" lastFinishedPulling="2026-04-22 19:24:39.402893965 +0000 UTC m=+68.453257437" observedRunningTime="2026-04-22 19:24:40.065287071 +0000 UTC m=+69.115650564" watchObservedRunningTime="2026-04-22 19:24:40.065731662 +0000 UTC m=+69.116095152" Apr 22 19:25:08.268180 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:08.268052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:25:08.268180 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:08.268093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:25:08.268707 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:08.268221 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:25:08.268707 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:08.268231 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:25:08.268707 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:08.268297 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls podName:638b7915-fc24-4304-b691-5e2dd5b5a7ce nodeName:}" failed. 
No retries permitted until 2026-04-22 19:26:12.268274165 +0000 UTC m=+161.318637650 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls") pod "dns-default-95lfv" (UID: "638b7915-fc24-4304-b691-5e2dd5b5a7ce") : secret "dns-default-metrics-tls" not found Apr 22 19:25:08.268707 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:08.268316 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert podName:44c36a5b-c1fd-4922-93c7-7d7e2ee8797e nodeName:}" failed. No retries permitted until 2026-04-22 19:26:12.268304142 +0000 UTC m=+161.318667616 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert") pod "ingress-canary-8vz9d" (UID: "44c36a5b-c1fd-4922-93c7-7d7e2ee8797e") : secret "canary-serving-cert" not found Apr 22 19:25:11.053982 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:11.053953 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mch7s" Apr 22 19:25:19.401890 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:19.401864 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8w5lr_9d618274-e61e-4ac5-b98d-0316d3addc15/dns-node-resolver/0.log" Apr 22 19:25:20.400182 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:20.400155 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6gbdc_3b3a8d77-2840-4166-a03a-e49d2f4f7de6/node-ca/0.log" Apr 22 19:25:40.298089 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:40.298055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod 
\"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:25:40.298555 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:40.298206 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:25:40.298555 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:40.298299 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs podName:5df89727-eca2-4929-8ab4-9c1a7832889b nodeName:}" failed. No retries permitted until 2026-04-22 19:27:42.298281843 +0000 UTC m=+251.348645311 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs") pod "network-metrics-daemon-gwm2k" (UID: "5df89727-eca2-4929-8ab4-9c1a7832889b") : secret "metrics-daemon-secret" not found Apr 22 19:25:52.199509 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.199448 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" podUID="adb06a7e-4d1f-4568-83a5-1faeb9a98fab" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:25:52.503809 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.503772 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rq2j9"] Apr 22 19:25:52.506701 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.506683 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.513002 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.512975 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-np8h6\"" Apr 22 19:25:52.514274 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.513977 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:25:52.514274 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.513998 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:25:52.514274 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.514013 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:25:52.514274 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.514016 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:25:52.523773 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.523747 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rq2j9"] Apr 22 19:25:52.586049 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.586011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-data-volume\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.586226 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.586059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.586226 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.586171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwgtf\" (UniqueName: \"kubernetes.io/projected/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-kube-api-access-mwgtf\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.586226 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.586204 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-crio-socket\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.586343 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.586234 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.686996 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.686956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rq2j9\" (UID: 
\"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687191 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwgtf\" (UniqueName: \"kubernetes.io/projected/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-kube-api-access-mwgtf\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687191 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-crio-socket\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687191 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687346 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-crio-socket\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687481 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-data-volume\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687631 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.687799 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.687777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-data-volume\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.689540 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.689517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.734951 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.734917 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwgtf\" (UniqueName: \"kubernetes.io/projected/f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865-kube-api-access-mwgtf\") pod \"insights-runtime-extractor-rq2j9\" (UID: \"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865\") " pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.819172 
ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.819078 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rq2j9" Apr 22 19:25:52.936266 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:52.936232 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rq2j9"] Apr 22 19:25:52.939677 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:25:52.939645 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ff4ce4_19f2_47d6_a74f_1ba8bf8b8865.slice/crio-48940b995553b4e6c9cdea4216d8a4e56b50b23796cab6d7de2c661079c9d01c WatchSource:0}: Error finding container 48940b995553b4e6c9cdea4216d8a4e56b50b23796cab6d7de2c661079c9d01c: Status 404 returned error can't find the container with id 48940b995553b4e6c9cdea4216d8a4e56b50b23796cab6d7de2c661079c9d01c Apr 22 19:25:53.217551 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:53.217469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rq2j9" event={"ID":"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865","Type":"ContainerStarted","Data":"c11d6bc2957b436c0fee460bfdce9e7d57ddd99e049c6644e9912cfa55e8940f"} Apr 22 19:25:53.217551 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:53.217510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rq2j9" event={"ID":"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865","Type":"ContainerStarted","Data":"48940b995553b4e6c9cdea4216d8a4e56b50b23796cab6d7de2c661079c9d01c"} Apr 22 19:25:54.221566 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:54.221527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rq2j9" event={"ID":"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865","Type":"ContainerStarted","Data":"6ff988f6e34d8031732c8f8215351fe7abc6a17856561030460163e7e9e699f1"} Apr 22 
19:25:55.225367 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.225335 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rq2j9" event={"ID":"f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865","Type":"ContainerStarted","Data":"476456d75af636b0df6a74128403a478cc77294019ff195bd388607c93df091a"} Apr 22 19:25:55.243643 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.243591 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rq2j9" podStartSLOduration=1.346772346 podStartE2EDuration="3.243553902s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="2026-04-22 19:25:52.990498471 +0000 UTC m=+142.040861940" lastFinishedPulling="2026-04-22 19:25:54.887280015 +0000 UTC m=+143.937643496" observedRunningTime="2026-04-22 19:25:55.242168693 +0000 UTC m=+144.292532184" watchObservedRunningTime="2026-04-22 19:25:55.243553902 +0000 UTC m=+144.293917384" Apr 22 19:25:55.399911 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.399876 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t2lpj"] Apr 22 19:25:55.402910 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.402892 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.405473 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.405445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:25:55.405630 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.405478 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:25:55.405630 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.405514 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:25:55.405737 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.405685 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-tvqfs\"" Apr 22 19:25:55.406075 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.406055 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:25:55.406165 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.406086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:25:55.412982 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.412959 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t2lpj"] Apr 22 19:25:55.508818 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.508722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eba12bbc-2b28-461b-b407-96b6fb2de23e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.508818 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.508779 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.508999 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.508837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5m8j\" (UniqueName: \"kubernetes.io/projected/eba12bbc-2b28-461b-b407-96b6fb2de23e-kube-api-access-k5m8j\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.508999 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.508877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.610085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.610038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eba12bbc-2b28-461b-b407-96b6fb2de23e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.610277 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:25:55.610102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.610277 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.610130 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5m8j\" (UniqueName: \"kubernetes.io/projected/eba12bbc-2b28-461b-b407-96b6fb2de23e-kube-api-access-k5m8j\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.610277 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.610157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.610277 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:55.610267 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 19:25:55.610451 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:25:55.610337 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-tls podName:eba12bbc-2b28-461b-b407-96b6fb2de23e nodeName:}" failed. No retries permitted until 2026-04-22 19:25:56.110313536 +0000 UTC m=+145.160677011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-t2lpj" (UID: "eba12bbc-2b28-461b-b407-96b6fb2de23e") : secret "prometheus-operator-tls" not found Apr 22 19:25:55.610830 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.610802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eba12bbc-2b28-461b-b407-96b6fb2de23e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.612526 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.612506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:55.619011 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:55.618988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5m8j\" (UniqueName: \"kubernetes.io/projected/eba12bbc-2b28-461b-b407-96b6fb2de23e-kube-api-access-k5m8j\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:56.115082 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:56.115037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: 
\"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:56.117464 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:56.117444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/eba12bbc-2b28-461b-b407-96b6fb2de23e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2lpj\" (UID: \"eba12bbc-2b28-461b-b407-96b6fb2de23e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:56.311779 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:56.311742 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" Apr 22 19:25:56.432204 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:56.432172 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t2lpj"] Apr 22 19:25:56.436166 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:25:56.436138 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeba12bbc_2b28_461b_b407_96b6fb2de23e.slice/crio-c02fefc6ceb7c387db9ab43e9f5686471b110644cae49ff4fe5c0ea483491c89 WatchSource:0}: Error finding container c02fefc6ceb7c387db9ab43e9f5686471b110644cae49ff4fe5c0ea483491c89: Status 404 returned error can't find the container with id c02fefc6ceb7c387db9ab43e9f5686471b110644cae49ff4fe5c0ea483491c89 Apr 22 19:25:57.235314 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:57.235276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" event={"ID":"eba12bbc-2b28-461b-b407-96b6fb2de23e","Type":"ContainerStarted","Data":"c02fefc6ceb7c387db9ab43e9f5686471b110644cae49ff4fe5c0ea483491c89"} Apr 22 19:25:58.240131 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:58.240099 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" event={"ID":"eba12bbc-2b28-461b-b407-96b6fb2de23e","Type":"ContainerStarted","Data":"06ed5357795dcd0ea3d8a38d4a4341789dc6c6adb1120d1d691bb23c0e47c37b"} Apr 22 19:25:58.240131 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:58.240134 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" event={"ID":"eba12bbc-2b28-461b-b407-96b6fb2de23e","Type":"ContainerStarted","Data":"2236fea03f3c8fce3f9d8fc1041836200c87cdbc1c7361225ec670edc42fe8f1"} Apr 22 19:25:58.259400 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:58.259349 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2lpj" podStartSLOduration=1.922396711 podStartE2EDuration="3.259335594s" podCreationTimestamp="2026-04-22 19:25:55 +0000 UTC" firstStartedPulling="2026-04-22 19:25:56.438000834 +0000 UTC m=+145.488364303" lastFinishedPulling="2026-04-22 19:25:57.774939714 +0000 UTC m=+146.825303186" observedRunningTime="2026-04-22 19:25:58.258025764 +0000 UTC m=+147.308389265" watchObservedRunningTime="2026-04-22 19:25:58.259335594 +0000 UTC m=+147.309699083" Apr 22 19:25:59.833806 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.833753 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n7wh4"] Apr 22 19:25:59.836946 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.836917 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.840810 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.840777 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:25:59.842589 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.842551 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nfpmc\"" Apr 22 19:25:59.842802 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.842769 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:25:59.843077 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.843059 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:25:59.943792 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.943760 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-wtmp\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.943792 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.943802 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944050 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.943827 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-tls\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944050 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.943920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-root\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944050 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.943957 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5crb\" (UniqueName: \"kubernetes.io/projected/b5f7f302-1d58-4544-8b02-0f35e261666a-kube-api-access-d5crb\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944050 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.944025 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-accelerators-collector-config\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944169 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.944065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5f7f302-1d58-4544-8b02-0f35e261666a-metrics-client-ca\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") 
" pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944169 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.944106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-sys\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:25:59.944169 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:25:59.944130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-textfile\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.044853 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044816 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-wtmp\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044887 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-tls\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-root\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5crb\" (UniqueName: \"kubernetes.io/projected/b5f7f302-1d58-4544-8b02-0f35e261666a-kube-api-access-d5crb\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-accelerators-collector-config\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.044989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5f7f302-1d58-4544-8b02-0f35e261666a-metrics-client-ca\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045002 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-wtmp\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-sys\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045402 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-textfile\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045402 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-sys\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045402 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045139 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5f7f302-1d58-4544-8b02-0f35e261666a-root\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045708 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-accelerators-collector-config\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.045816 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.045737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5f7f302-1d58-4544-8b02-0f35e261666a-metrics-client-ca\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.047058 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.047037 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-textfile\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.047301 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.047281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-tls\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.047445 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.047424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5f7f302-1d58-4544-8b02-0f35e261666a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.052901 ip-10-0-131-132 kubenswrapper[2570]: I0422 
19:26:00.052883 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5crb\" (UniqueName: \"kubernetes.io/projected/b5f7f302-1d58-4544-8b02-0f35e261666a-kube-api-access-d5crb\") pod \"node-exporter-n7wh4\" (UID: \"b5f7f302-1d58-4544-8b02-0f35e261666a\") " pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.146179 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.146087 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n7wh4" Apr 22 19:26:00.156081 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:26:00.156048 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f7f302_1d58_4544_8b02_0f35e261666a.slice/crio-055f5c2539dc7a9043dc3c6a66833c895b3030e2155b8e782484551a0c143491 WatchSource:0}: Error finding container 055f5c2539dc7a9043dc3c6a66833c895b3030e2155b8e782484551a0c143491: Status 404 returned error can't find the container with id 055f5c2539dc7a9043dc3c6a66833c895b3030e2155b8e782484551a0c143491 Apr 22 19:26:00.246721 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:00.246688 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7wh4" event={"ID":"b5f7f302-1d58-4544-8b02-0f35e261666a","Type":"ContainerStarted","Data":"055f5c2539dc7a9043dc3c6a66833c895b3030e2155b8e782484551a0c143491"} Apr 22 19:26:01.250559 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:01.250510 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5f7f302-1d58-4544-8b02-0f35e261666a" containerID="97ba2b417ea05c7cefa565b38e25d480d1648a674c0016d063b3fd46082a1000" exitCode=0 Apr 22 19:26:01.251021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:01.250645 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7wh4" 
event={"ID":"b5f7f302-1d58-4544-8b02-0f35e261666a","Type":"ContainerDied","Data":"97ba2b417ea05c7cefa565b38e25d480d1648a674c0016d063b3fd46082a1000"} Apr 22 19:26:02.199885 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:02.199836 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" podUID="adb06a7e-4d1f-4568-83a5-1faeb9a98fab" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:26:02.254916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:02.254879 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7wh4" event={"ID":"b5f7f302-1d58-4544-8b02-0f35e261666a","Type":"ContainerStarted","Data":"8decaa3200799ff7c123ed9905fb613563ded5ed4973719cf659de6b3d30c3b1"} Apr 22 19:26:02.254916 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:02.254917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7wh4" event={"ID":"b5f7f302-1d58-4544-8b02-0f35e261666a","Type":"ContainerStarted","Data":"c05c65006e9a8d1d08cbc2e71c0a7e001afb134cc639b712d52c1afec4dc90db"} Apr 22 19:26:02.279608 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:02.279533 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n7wh4" podStartSLOduration=2.583398378 podStartE2EDuration="3.279515905s" podCreationTimestamp="2026-04-22 19:25:59 +0000 UTC" firstStartedPulling="2026-04-22 19:26:00.157853146 +0000 UTC m=+149.208216614" lastFinishedPulling="2026-04-22 19:26:00.853970669 +0000 UTC m=+149.904334141" observedRunningTime="2026-04-22 19:26:02.277831426 +0000 UTC m=+151.328194917" watchObservedRunningTime="2026-04-22 19:26:02.279515905 +0000 UTC m=+151.329879388" Apr 22 19:26:04.620414 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.620378 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8"] Apr 22 19:26:04.623588 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.623558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:04.627779 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.627748 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 19:26:04.627906 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.627794 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-jctk4\"" Apr 22 19:26:04.640211 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.640186 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8"] Apr 22 19:26:04.781079 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.781038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/865bb00c-3a97-4aab-a7bd-cca3ef857f4f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9zrx8\" (UID: \"865bb00c-3a97-4aab-a7bd-cca3ef857f4f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:04.882429 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.882344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/865bb00c-3a97-4aab-a7bd-cca3ef857f4f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9zrx8\" (UID: \"865bb00c-3a97-4aab-a7bd-cca3ef857f4f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:04.884778 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.884756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/865bb00c-3a97-4aab-a7bd-cca3ef857f4f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9zrx8\" (UID: \"865bb00c-3a97-4aab-a7bd-cca3ef857f4f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:04.931917 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:04.931886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:05.047238 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:05.047203 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8"] Apr 22 19:26:05.050067 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:26:05.050037 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865bb00c_3a97_4aab_a7bd_cca3ef857f4f.slice/crio-5522f64857a4fa9be556415d39ddcffa87c815f84b5dc8fd1358fd55a9517904 WatchSource:0}: Error finding container 5522f64857a4fa9be556415d39ddcffa87c815f84b5dc8fd1358fd55a9517904: Status 404 returned error can't find the container with id 5522f64857a4fa9be556415d39ddcffa87c815f84b5dc8fd1358fd55a9517904 Apr 22 19:26:05.263855 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:05.263815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" event={"ID":"865bb00c-3a97-4aab-a7bd-cca3ef857f4f","Type":"ContainerStarted","Data":"5522f64857a4fa9be556415d39ddcffa87c815f84b5dc8fd1358fd55a9517904"} Apr 22 19:26:07.272069 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:07.272031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" event={"ID":"865bb00c-3a97-4aab-a7bd-cca3ef857f4f","Type":"ContainerStarted","Data":"fb81fcf9d7f98ab091fa04a45a5f644050b963960ea31e934a890cfdbb6b1186"} Apr 22 19:26:07.272462 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:26:07.272246 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:07.276751 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:07.276729 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" Apr 22 19:26:07.287064 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:07.287024 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9zrx8" podStartSLOduration=1.7350700940000001 podStartE2EDuration="3.2870113s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:05.051955944 +0000 UTC m=+154.102319412" lastFinishedPulling="2026-04-22 19:26:06.60389715 +0000 UTC m=+155.654260618" observedRunningTime="2026-04-22 19:26:07.286496194 +0000 UTC m=+156.336859685" watchObservedRunningTime="2026-04-22 19:26:07.2870113 +0000 UTC m=+156.337374791" Apr 22 19:26:07.402819 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:26:07.402781 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-95lfv" podUID="638b7915-fc24-4304-b691-5e2dd5b5a7ce" Apr 22 19:26:07.417432 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:26:07.417408 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8vz9d" podUID="44c36a5b-c1fd-4922-93c7-7d7e2ee8797e" Apr 22 19:26:07.609922 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:26:07.609836 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline 
exceeded" pod="openshift-multus/network-metrics-daemon-gwm2k" podUID="5df89727-eca2-4929-8ab4-9c1a7832889b" Apr 22 19:26:08.274241 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:08.274210 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-95lfv" Apr 22 19:26:09.341705 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.341668 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8577c4c8b-fnzp4"] Apr 22 19:26:09.344726 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.344710 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.347357 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.347336 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:26:09.347451 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.347386 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:26:09.348448 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.348429 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:26:09.348673 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.348646 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:26:09.348673 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.348429 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:26:09.348832 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.348429 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qlpqn\"" Apr 22 19:26:09.348832 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:26:09.348708 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:26:09.348832 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.348431 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:26:09.353387 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.353355 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:26:09.353741 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.353722 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8577c4c8b-fnzp4"] Apr 22 19:26:09.514256 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514216 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-oauth-serving-cert\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.514442 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514316 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-service-ca\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.514442 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6845\" (UniqueName: \"kubernetes.io/projected/5292ef93-da54-4383-bd75-5e3fda0bf670-kube-api-access-q6845\") pod \"console-8577c4c8b-fnzp4\" (UID: 
\"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.514442 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-serving-cert\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.514442 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-oauth-config\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.514626 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-console-config\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.514626 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.514466 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-trusted-ca-bundle\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.614914 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.614826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-oauth-serving-cert\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.614914 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.614904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-service-ca\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615094 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.614929 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6845\" (UniqueName: \"kubernetes.io/projected/5292ef93-da54-4383-bd75-5e3fda0bf670-kube-api-access-q6845\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615094 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.614956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-serving-cert\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615263 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.615240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-oauth-config\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615351 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.615334 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-console-config\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615410 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.615359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-trusted-ca-bundle\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615664 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.615639 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-oauth-serving-cert\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.615758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.615670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-service-ca\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.616045 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.616023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-console-config\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.616257 ip-10-0-131-132 kubenswrapper[2570]: I0422 
19:26:09.616241 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-trusted-ca-bundle\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.617613 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.617586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-serving-cert\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.617714 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.617616 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-oauth-config\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.623870 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.623850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6845\" (UniqueName: \"kubernetes.io/projected/5292ef93-da54-4383-bd75-5e3fda0bf670-kube-api-access-q6845\") pod \"console-8577c4c8b-fnzp4\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.654354 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.654319 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:09.773591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:09.773548 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8577c4c8b-fnzp4"] Apr 22 19:26:09.775984 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:26:09.775960 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5292ef93_da54_4383_bd75_5e3fda0bf670.slice/crio-095a00688e6b3d83793618d19627ef42b2c9040c9ec5076d69b58a257005c436 WatchSource:0}: Error finding container 095a00688e6b3d83793618d19627ef42b2c9040c9ec5076d69b58a257005c436: Status 404 returned error can't find the container with id 095a00688e6b3d83793618d19627ef42b2c9040c9ec5076d69b58a257005c436 Apr 22 19:26:10.281367 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:10.281278 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8577c4c8b-fnzp4" event={"ID":"5292ef93-da54-4383-bd75-5e3fda0bf670","Type":"ContainerStarted","Data":"095a00688e6b3d83793618d19627ef42b2c9040c9ec5076d69b58a257005c436"} Apr 22 19:26:12.199339 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.199275 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" podUID="adb06a7e-4d1f-4568-83a5-1faeb9a98fab" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:26:12.199910 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.199370 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" Apr 22 19:26:12.199966 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.199927 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"8f98443998642f5375338b777f269c722c5464e3904e80d4d3b1913e2e736278"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 19:26:12.200021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.200003 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" podUID="adb06a7e-4d1f-4568-83a5-1faeb9a98fab" containerName="service-proxy" containerID="cri-o://8f98443998642f5375338b777f269c722c5464e3904e80d4d3b1913e2e736278" gracePeriod=30 Apr 22 19:26:12.339587 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.339534 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:26:12.339783 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.339615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:26:12.342387 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.342359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c36a5b-c1fd-4922-93c7-7d7e2ee8797e-cert\") pod \"ingress-canary-8vz9d\" (UID: \"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e\") " pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:26:12.342530 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.342429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/638b7915-fc24-4304-b691-5e2dd5b5a7ce-metrics-tls\") pod \"dns-default-95lfv\" (UID: \"638b7915-fc24-4304-b691-5e2dd5b5a7ce\") " pod="openshift-dns/dns-default-95lfv" Apr 22 19:26:12.477456 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.477433 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8zlb\"" Apr 22 19:26:12.485612 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.485594 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-95lfv" Apr 22 19:26:12.642819 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:12.641273 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-95lfv"] Apr 22 19:26:12.643695 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:26:12.643649 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod638b7915_fc24_4304_b691_5e2dd5b5a7ce.slice/crio-e5620d7cc135b5a4e7b1f6221e6e9cf53572e156c0f70bf5c767aaa424079d6b WatchSource:0}: Error finding container e5620d7cc135b5a4e7b1f6221e6e9cf53572e156c0f70bf5c767aaa424079d6b: Status 404 returned error can't find the container with id e5620d7cc135b5a4e7b1f6221e6e9cf53572e156c0f70bf5c767aaa424079d6b Apr 22 19:26:13.291185 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:13.291147 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-95lfv" event={"ID":"638b7915-fc24-4304-b691-5e2dd5b5a7ce","Type":"ContainerStarted","Data":"e5620d7cc135b5a4e7b1f6221e6e9cf53572e156c0f70bf5c767aaa424079d6b"} Apr 22 19:26:13.293391 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:13.293361 2570 generic.go:358] "Generic (PLEG): container finished" podID="adb06a7e-4d1f-4568-83a5-1faeb9a98fab" containerID="8f98443998642f5375338b777f269c722c5464e3904e80d4d3b1913e2e736278" exitCode=2 Apr 22 19:26:13.293537 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:26:13.293436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" event={"ID":"adb06a7e-4d1f-4568-83a5-1faeb9a98fab","Type":"ContainerDied","Data":"8f98443998642f5375338b777f269c722c5464e3904e80d4d3b1913e2e736278"} Apr 22 19:26:13.293537 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:13.293463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-888454476-wklt4" event={"ID":"adb06a7e-4d1f-4568-83a5-1faeb9a98fab","Type":"ContainerStarted","Data":"f5a1813406fb3cfe03811c3a0b5aa0867463d0e929f902bfc0375e751ac5278c"} Apr 22 19:26:13.295034 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:13.294998 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8577c4c8b-fnzp4" event={"ID":"5292ef93-da54-4383-bd75-5e3fda0bf670","Type":"ContainerStarted","Data":"e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0"} Apr 22 19:26:13.330399 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:13.330333 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8577c4c8b-fnzp4" podStartSLOduration=1.625579978 podStartE2EDuration="4.330310047s" podCreationTimestamp="2026-04-22 19:26:09 +0000 UTC" firstStartedPulling="2026-04-22 19:26:09.777752993 +0000 UTC m=+158.828116461" lastFinishedPulling="2026-04-22 19:26:12.482483061 +0000 UTC m=+161.532846530" observedRunningTime="2026-04-22 19:26:13.329782056 +0000 UTC m=+162.380145548" watchObservedRunningTime="2026-04-22 19:26:13.330310047 +0000 UTC m=+162.380673539" Apr 22 19:26:14.299942 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:14.299905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-95lfv" 
event={"ID":"638b7915-fc24-4304-b691-5e2dd5b5a7ce","Type":"ContainerStarted","Data":"29b9004e207cca5e53adb55d57dbecfaf2cb401506f88add7df6f6f5eeac05fd"} Apr 22 19:26:15.304913 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:15.304867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-95lfv" event={"ID":"638b7915-fc24-4304-b691-5e2dd5b5a7ce","Type":"ContainerStarted","Data":"69b784775c3d9b9d0b3f07326e956e2756abf5e5a957f68b55538b03df0f90d8"} Apr 22 19:26:15.305299 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:15.304982 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-95lfv" Apr 22 19:26:15.323511 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:15.323451 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-95lfv" podStartSLOduration=129.859769456 podStartE2EDuration="2m11.323432275s" podCreationTimestamp="2026-04-22 19:24:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:12.645785805 +0000 UTC m=+161.696149277" lastFinishedPulling="2026-04-22 19:26:14.109448623 +0000 UTC m=+163.159812096" observedRunningTime="2026-04-22 19:26:15.322727279 +0000 UTC m=+164.373090781" watchObservedRunningTime="2026-04-22 19:26:15.323432275 +0000 UTC m=+164.373795770" Apr 22 19:26:18.591421 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:18.591387 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:26:18.591817 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:18.591385 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:26:18.594167 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:18.594152 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8fgp\"" Apr 22 19:26:18.602450 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:18.602428 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8vz9d" Apr 22 19:26:18.723022 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:18.722990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8vz9d"] Apr 22 19:26:18.728295 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:26:18.728259 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c36a5b_c1fd_4922_93c7_7d7e2ee8797e.slice/crio-58129fce0fb6482ae24789a9f63afcf6d29ec28269ea67448ed41a5fc787345d WatchSource:0}: Error finding container 58129fce0fb6482ae24789a9f63afcf6d29ec28269ea67448ed41a5fc787345d: Status 404 returned error can't find the container with id 58129fce0fb6482ae24789a9f63afcf6d29ec28269ea67448ed41a5fc787345d Apr 22 19:26:19.317374 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:19.317329 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8vz9d" event={"ID":"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e","Type":"ContainerStarted","Data":"58129fce0fb6482ae24789a9f63afcf6d29ec28269ea67448ed41a5fc787345d"} Apr 22 19:26:19.655249 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:19.655152 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:19.655249 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:19.655207 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:19.660706 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:19.660683 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:20.325273 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:20.325236 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8vz9d" event={"ID":"44c36a5b-c1fd-4922-93c7-7d7e2ee8797e","Type":"ContainerStarted","Data":"3b8b4b5ac7b39f64a63230a3c65c10b0cc1cb6ee5913d46730b719822e8fb458"} Apr 22 19:26:20.329424 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:20.329385 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:26:20.347974 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:20.347925 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8vz9d" podStartSLOduration=134.855221795 podStartE2EDuration="2m16.347910031s" podCreationTimestamp="2026-04-22 19:24:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:18.730631676 +0000 UTC m=+167.780995144" lastFinishedPulling="2026-04-22 19:26:20.223319909 +0000 UTC m=+169.273683380" observedRunningTime="2026-04-22 19:26:20.347402961 +0000 UTC m=+169.397766452" watchObservedRunningTime="2026-04-22 19:26:20.347910031 +0000 UTC m=+169.398273521" Apr 22 19:26:25.310021 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:26:25.309987 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-95lfv" Apr 22 19:27:27.136589 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.136536 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c9d8c7b5f-54bd5"] Apr 22 19:27:27.138510 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.138493 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.148192 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.148159 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c9d8c7b5f-54bd5"] Apr 22 19:27:27.211599 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211536 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-config\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.211599 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-oauth-serving-cert\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.211814 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211630 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnp2s\" (UniqueName: \"kubernetes.io/projected/589f5379-60ee-4dd9-ab5f-03768eacea7a-kube-api-access-xnp2s\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.211814 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-trusted-ca-bundle\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" 
Apr 22 19:27:27.211814 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-service-ca\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.211814 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211720 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-serving-cert\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.211814 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.211741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-oauth-config\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312539 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-trusted-ca-bundle\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312740 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-service-ca\") pod 
\"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312740 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-serving-cert\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312865 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-oauth-config\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312865 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312831 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-config\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312962 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-oauth-serving-cert\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.312962 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.312899 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnp2s\" (UniqueName: 
\"kubernetes.io/projected/589f5379-60ee-4dd9-ab5f-03768eacea7a-kube-api-access-xnp2s\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.314011 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.313959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-service-ca\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.314011 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.313974 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-oauth-serving-cert\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.314186 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.314071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-config\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.314333 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.314308 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-trusted-ca-bundle\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.315241 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.315219 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-oauth-config\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.315483 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.315407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-serving-cert\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.325047 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.325019 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnp2s\" (UniqueName: \"kubernetes.io/projected/589f5379-60ee-4dd9-ab5f-03768eacea7a-kube-api-access-xnp2s\") pod \"console-5c9d8c7b5f-54bd5\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.447496 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.447401 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:27.575909 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:27.574153 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c9d8c7b5f-54bd5"] Apr 22 19:27:28.497189 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:28.497151 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9d8c7b5f-54bd5" event={"ID":"589f5379-60ee-4dd9-ab5f-03768eacea7a","Type":"ContainerStarted","Data":"ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9"} Apr 22 19:27:28.497189 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:28.497191 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9d8c7b5f-54bd5" event={"ID":"589f5379-60ee-4dd9-ab5f-03768eacea7a","Type":"ContainerStarted","Data":"5c3ed277c21d73447161ed3c2ba2665f2526cb71e5d6835d0810477fbce26134"} Apr 22 19:27:28.516035 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:28.515986 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c9d8c7b5f-54bd5" podStartSLOduration=1.51597205 podStartE2EDuration="1.51597205s" podCreationTimestamp="2026-04-22 19:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:28.514806855 +0000 UTC m=+237.565170345" watchObservedRunningTime="2026-04-22 19:27:28.51597205 +0000 UTC m=+237.566335540" Apr 22 19:27:37.448141 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:37.448093 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:37.448141 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:37.448147 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:37.453004 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:27:37.452979 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:37.522946 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:37.522910 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:27:37.567815 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:37.567784 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8577c4c8b-fnzp4"] Apr 22 19:27:42.330468 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:42.330422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:27:42.332744 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:42.332715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5df89727-eca2-4929-8ab4-9c1a7832889b-metrics-certs\") pod \"network-metrics-daemon-gwm2k\" (UID: \"5df89727-eca2-4929-8ab4-9c1a7832889b\") " pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:27:42.594779 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:42.594699 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kmhxv\"" Apr 22 19:27:42.602477 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:42.602442 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gwm2k" Apr 22 19:27:42.719217 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:42.719187 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gwm2k"] Apr 22 19:27:42.722654 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:27:42.722626 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df89727_eca2_4929_8ab4_9c1a7832889b.slice/crio-3e7cbbedee4240d727851abb84591a57ed2f7d35892e8987db96d0427783e6c7 WatchSource:0}: Error finding container 3e7cbbedee4240d727851abb84591a57ed2f7d35892e8987db96d0427783e6c7: Status 404 returned error can't find the container with id 3e7cbbedee4240d727851abb84591a57ed2f7d35892e8987db96d0427783e6c7 Apr 22 19:27:43.536320 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:43.536284 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gwm2k" event={"ID":"5df89727-eca2-4929-8ab4-9c1a7832889b","Type":"ContainerStarted","Data":"3e7cbbedee4240d727851abb84591a57ed2f7d35892e8987db96d0427783e6c7"} Apr 22 19:27:44.540727 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:44.540689 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gwm2k" event={"ID":"5df89727-eca2-4929-8ab4-9c1a7832889b","Type":"ContainerStarted","Data":"31bc066516651b2ac2815f1c407358cf4e66bca2564500074faff13f620a1a48"} Apr 22 19:27:44.540727 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:44.540728 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gwm2k" event={"ID":"5df89727-eca2-4929-8ab4-9c1a7832889b","Type":"ContainerStarted","Data":"8e63323fa59be5fc7bc6f440e91893538593914549dae30dc11e3ffd33bd2c42"} Apr 22 19:27:44.557620 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:27:44.557550 2570 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-gwm2k" podStartSLOduration=252.709917615 podStartE2EDuration="4m13.557535644s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:27:42.724413856 +0000 UTC m=+251.774777337" lastFinishedPulling="2026-04-22 19:27:43.572031883 +0000 UTC m=+252.622395366" observedRunningTime="2026-04-22 19:27:44.557492294 +0000 UTC m=+253.607855787" watchObservedRunningTime="2026-04-22 19:27:44.557535644 +0000 UTC m=+253.607899112" Apr 22 19:28:02.586684 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.586594 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8577c4c8b-fnzp4" podUID="5292ef93-da54-4383-bd75-5e3fda0bf670" containerName="console" containerID="cri-o://e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0" gracePeriod=15 Apr 22 19:28:02.827399 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.827374 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8577c4c8b-fnzp4_5292ef93-da54-4383-bd75-5e3fda0bf670/console/0.log" Apr 22 19:28:02.827544 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.827447 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:28:02.991061 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991031 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-console-config\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.991268 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991075 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-service-ca\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.991268 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991128 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-oauth-serving-cert\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.991268 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991235 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-serving-cert\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.991425 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991274 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6845\" (UniqueName: \"kubernetes.io/projected/5292ef93-da54-4383-bd75-5e3fda0bf670-kube-api-access-q6845\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.991425 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:28:02.991309 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-oauth-config\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.991425 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991334 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-trusted-ca-bundle\") pod \"5292ef93-da54-4383-bd75-5e3fda0bf670\" (UID: \"5292ef93-da54-4383-bd75-5e3fda0bf670\") " Apr 22 19:28:02.992013 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.991973 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:02.992146 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.992089 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:02.992250 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.992224 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-service-ca" (OuterVolumeSpecName: "service-ca") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:02.992316 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.992259 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-console-config" (OuterVolumeSpecName: "console-config") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:02.992435 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.992418 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-oauth-serving-cert\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:02.994237 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.994187 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5292ef93-da54-4383-bd75-5e3fda0bf670-kube-api-access-q6845" (OuterVolumeSpecName: "kube-api-access-q6845") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "kube-api-access-q6845". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:02.994356 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.994205 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:02.997067 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:02.997032 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5292ef93-da54-4383-bd75-5e3fda0bf670" (UID: "5292ef93-da54-4383-bd75-5e3fda0bf670"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:03.093616 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.093561 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-serving-cert\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:03.093616 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.093612 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6845\" (UniqueName: \"kubernetes.io/projected/5292ef93-da54-4383-bd75-5e3fda0bf670-kube-api-access-q6845\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:03.093616 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.093622 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5292ef93-da54-4383-bd75-5e3fda0bf670-console-oauth-config\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:03.093837 
ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.093631 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-trusted-ca-bundle\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:03.093837 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.093641 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-console-config\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:03.093837 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.093650 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5292ef93-da54-4383-bd75-5e3fda0bf670-service-ca\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:28:03.592236 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.592212 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8577c4c8b-fnzp4_5292ef93-da54-4383-bd75-5e3fda0bf670/console/0.log" Apr 22 19:28:03.592620 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.592249 2570 generic.go:358] "Generic (PLEG): container finished" podID="5292ef93-da54-4383-bd75-5e3fda0bf670" containerID="e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0" exitCode=2 Apr 22 19:28:03.592620 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.592345 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8577c4c8b-fnzp4" Apr 22 19:28:03.594425 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.594398 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8577c4c8b-fnzp4" event={"ID":"5292ef93-da54-4383-bd75-5e3fda0bf670","Type":"ContainerDied","Data":"e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0"} Apr 22 19:28:03.594558 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.594429 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8577c4c8b-fnzp4" event={"ID":"5292ef93-da54-4383-bd75-5e3fda0bf670","Type":"ContainerDied","Data":"095a00688e6b3d83793618d19627ef42b2c9040c9ec5076d69b58a257005c436"} Apr 22 19:28:03.594558 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.594444 2570 scope.go:117] "RemoveContainer" containerID="e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0" Apr 22 19:28:03.602900 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.602880 2570 scope.go:117] "RemoveContainer" containerID="e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0" Apr 22 19:28:03.603183 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:28:03.603160 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0\": container with ID starting with e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0 not found: ID does not exist" containerID="e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0" Apr 22 19:28:03.603276 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.603189 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0"} err="failed to get container status \"e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0\": rpc error: code = 
NotFound desc = could not find container \"e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0\": container with ID starting with e129b803cbeed6ed485b34f5d7bd9667b780b4a3b4f81e30905afe664c959dc0 not found: ID does not exist" Apr 22 19:28:03.617891 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.617862 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8577c4c8b-fnzp4"] Apr 22 19:28:03.621050 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:03.621026 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8577c4c8b-fnzp4"] Apr 22 19:28:05.595131 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:05.595091 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5292ef93-da54-4383-bd75-5e3fda0bf670" path="/var/lib/kubelet/pods/5292ef93-da54-4383-bd75-5e3fda0bf670/volumes" Apr 22 19:28:31.473746 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:31.473713 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:28:31.474258 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:31.473714 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:28:31.479088 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:31.479060 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:28:46.866151 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:28:46.866116 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c9d8c7b5f-54bd5"] Apr 22 19:29:11.887894 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:11.887850 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c9d8c7b5f-54bd5" podUID="589f5379-60ee-4dd9-ab5f-03768eacea7a" containerName="console" 
containerID="cri-o://ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9" gracePeriod=15 Apr 22 19:29:12.124659 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.124636 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c9d8c7b5f-54bd5_589f5379-60ee-4dd9-ab5f-03768eacea7a/console/0.log" Apr 22 19:29:12.124781 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.124698 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:29:12.181261 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181182 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-oauth-serving-cert\") pod \"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181261 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181224 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-serving-cert\") pod \"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181457 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181273 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnp2s\" (UniqueName: \"kubernetes.io/projected/589f5379-60ee-4dd9-ab5f-03768eacea7a-kube-api-access-xnp2s\") pod \"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181457 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181297 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-oauth-config\") pod 
\"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181457 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181344 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-trusted-ca-bundle\") pod \"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181457 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181370 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-config\") pod \"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181457 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181416 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-service-ca\") pod \"589f5379-60ee-4dd9-ab5f-03768eacea7a\" (UID: \"589f5379-60ee-4dd9-ab5f-03768eacea7a\") " Apr 22 19:29:12.181743 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181629 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:29:12.181898 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181868 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-config" (OuterVolumeSpecName: "console-config") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:29:12.181962 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181872 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:29:12.181962 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.181902 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-service-ca" (OuterVolumeSpecName: "service-ca") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:29:12.183508 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.183484 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:29:12.183912 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.183893 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:29:12.183978 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.183913 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589f5379-60ee-4dd9-ab5f-03768eacea7a-kube-api-access-xnp2s" (OuterVolumeSpecName: "kube-api-access-xnp2s") pod "589f5379-60ee-4dd9-ab5f-03768eacea7a" (UID: "589f5379-60ee-4dd9-ab5f-03768eacea7a"). InnerVolumeSpecName "kube-api-access-xnp2s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:29:12.282562 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.282513 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-trusted-ca-bundle\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.282562 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.282555 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-config\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.282562 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.282567 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-service-ca\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.282562 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:29:12.282596 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/589f5379-60ee-4dd9-ab5f-03768eacea7a-oauth-serving-cert\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.282562 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.282605 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-serving-cert\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.282874 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.282616 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnp2s\" (UniqueName: \"kubernetes.io/projected/589f5379-60ee-4dd9-ab5f-03768eacea7a-kube-api-access-xnp2s\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.282874 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.282625 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/589f5379-60ee-4dd9-ab5f-03768eacea7a-console-oauth-config\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:29:12.767391 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.767361 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c9d8c7b5f-54bd5_589f5379-60ee-4dd9-ab5f-03768eacea7a/console/0.log" Apr 22 19:29:12.767591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.767403 2570 generic.go:358] "Generic (PLEG): container finished" podID="589f5379-60ee-4dd9-ab5f-03768eacea7a" containerID="ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9" exitCode=2 Apr 22 19:29:12.767591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.767480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9d8c7b5f-54bd5" 
event={"ID":"589f5379-60ee-4dd9-ab5f-03768eacea7a","Type":"ContainerDied","Data":"ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9"} Apr 22 19:29:12.767591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.767494 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c9d8c7b5f-54bd5" Apr 22 19:29:12.767591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.767507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9d8c7b5f-54bd5" event={"ID":"589f5379-60ee-4dd9-ab5f-03768eacea7a","Type":"ContainerDied","Data":"5c3ed277c21d73447161ed3c2ba2665f2526cb71e5d6835d0810477fbce26134"} Apr 22 19:29:12.767591 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.767521 2570 scope.go:117] "RemoveContainer" containerID="ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9" Apr 22 19:29:12.775715 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.775697 2570 scope.go:117] "RemoveContainer" containerID="ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9" Apr 22 19:29:12.775987 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:29:12.775968 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9\": container with ID starting with ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9 not found: ID does not exist" containerID="ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9" Apr 22 19:29:12.776033 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.775994 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9"} err="failed to get container status \"ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9\": rpc error: code = NotFound desc = could not find container 
\"ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9\": container with ID starting with ca9182e32dba2fb6b1d1c5ca3f5beaf56d5655a1da3a768ff2c55bce40f872f9 not found: ID does not exist" Apr 22 19:29:12.788708 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.788679 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c9d8c7b5f-54bd5"] Apr 22 19:29:12.792923 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:12.792873 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c9d8c7b5f-54bd5"] Apr 22 19:29:13.595026 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:13.594993 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589f5379-60ee-4dd9-ab5f-03768eacea7a" path="/var/lib/kubelet/pods/589f5379-60ee-4dd9-ab5f-03768eacea7a/volumes" Apr 22 19:29:31.541745 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.541708 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hvh5z"] Apr 22 19:29:31.542108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.541966 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5292ef93-da54-4383-bd75-5e3fda0bf670" containerName="console" Apr 22 19:29:31.542108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.541978 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5292ef93-da54-4383-bd75-5e3fda0bf670" containerName="console" Apr 22 19:29:31.542108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.541996 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="589f5379-60ee-4dd9-ab5f-03768eacea7a" containerName="console" Apr 22 19:29:31.542108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.542001 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="589f5379-60ee-4dd9-ab5f-03768eacea7a" containerName="console" Apr 22 19:29:31.542108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.542042 2570 
memory_manager.go:356] "RemoveStaleState removing state" podUID="589f5379-60ee-4dd9-ab5f-03768eacea7a" containerName="console" Apr 22 19:29:31.542108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.542051 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5292ef93-da54-4383-bd75-5e3fda0bf670" containerName="console" Apr 22 19:29:31.545052 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.545036 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.551072 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.551052 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:29:31.557527 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.557504 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hvh5z"] Apr 22 19:29:31.620959 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.620927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1d48cabd-e6ed-491b-9c22-028c42616dca-kubelet-config\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.620959 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.620975 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1d48cabd-e6ed-491b-9c22-028c42616dca-original-pull-secret\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.621182 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.621049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/1d48cabd-e6ed-491b-9c22-028c42616dca-dbus\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.722242 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.722194 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1d48cabd-e6ed-491b-9c22-028c42616dca-kubelet-config\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.722396 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.722258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1d48cabd-e6ed-491b-9c22-028c42616dca-original-pull-secret\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.722396 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.722295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1d48cabd-e6ed-491b-9c22-028c42616dca-dbus\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.722396 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.722344 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1d48cabd-e6ed-491b-9c22-028c42616dca-kubelet-config\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.722536 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.722481 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1d48cabd-e6ed-491b-9c22-028c42616dca-dbus\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.724563 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.724530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1d48cabd-e6ed-491b-9c22-028c42616dca-original-pull-secret\") pod \"global-pull-secret-syncer-hvh5z\" (UID: \"1d48cabd-e6ed-491b-9c22-028c42616dca\") " pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.854362 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.854275 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hvh5z" Apr 22 19:29:31.973512 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.973480 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hvh5z"] Apr 22 19:29:31.976671 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:29:31.976636 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d48cabd_e6ed_491b_9c22_028c42616dca.slice/crio-a89520396f0f6b403631812bf58b188725752de3ea13b6c174a6abfc157b9ba6 WatchSource:0}: Error finding container a89520396f0f6b403631812bf58b188725752de3ea13b6c174a6abfc157b9ba6: Status 404 returned error can't find the container with id a89520396f0f6b403631812bf58b188725752de3ea13b6c174a6abfc157b9ba6 Apr 22 19:29:31.978199 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:31.978181 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:29:32.816762 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:32.816719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-hvh5z" event={"ID":"1d48cabd-e6ed-491b-9c22-028c42616dca","Type":"ContainerStarted","Data":"a89520396f0f6b403631812bf58b188725752de3ea13b6c174a6abfc157b9ba6"} Apr 22 19:29:36.828340 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:36.828300 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hvh5z" event={"ID":"1d48cabd-e6ed-491b-9c22-028c42616dca","Type":"ContainerStarted","Data":"d3f69db3c940c2f98ad9536a7051b68540d91e6a22be9e9f6f5eb8544b71f8ee"} Apr 22 19:29:36.845452 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:29:36.845400 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hvh5z" podStartSLOduration=1.858730655 podStartE2EDuration="5.845386951s" podCreationTimestamp="2026-04-22 19:29:31 +0000 UTC" firstStartedPulling="2026-04-22 19:29:31.978311978 +0000 UTC m=+361.028675449" lastFinishedPulling="2026-04-22 19:29:35.964968273 +0000 UTC m=+365.015331745" observedRunningTime="2026-04-22 19:29:36.84420611 +0000 UTC m=+365.894569600" watchObservedRunningTime="2026-04-22 19:29:36.845386951 +0000 UTC m=+365.895750440" Apr 22 19:30:24.405738 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.405707 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl"] Apr 22 19:30:24.408793 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.408775 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.411338 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.411313 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:30:24.412399 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.412373 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:30:24.412520 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.412414 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dm6gf\"" Apr 22 19:30:24.418924 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.418898 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl"] Apr 22 19:30:24.499564 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.499521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdn2k\" (UniqueName: \"kubernetes.io/projected/23ebb35c-b5d9-4de4-a35a-1c9597baee07-kube-api-access-qdn2k\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.499775 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.499596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.499775 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.499620 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.600679 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.600638 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdn2k\" (UniqueName: \"kubernetes.io/projected/23ebb35c-b5d9-4de4-a35a-1c9597baee07-kube-api-access-qdn2k\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.600845 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.600693 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.600845 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.600735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.601121 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.601097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.601184 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.601134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.612808 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.612786 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdn2k\" (UniqueName: \"kubernetes.io/projected/23ebb35c-b5d9-4de4-a35a-1c9597baee07-kube-api-access-qdn2k\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.717874 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.717780 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:24.837663 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.837630 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl"] Apr 22 19:30:24.841707 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:30:24.841675 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23ebb35c_b5d9_4de4_a35a_1c9597baee07.slice/crio-4646387c6db02dedbd5eed3263b3a834e71dea1eebcb3e97738c46d099797102 WatchSource:0}: Error finding container 4646387c6db02dedbd5eed3263b3a834e71dea1eebcb3e97738c46d099797102: Status 404 returned error can't find the container with id 4646387c6db02dedbd5eed3263b3a834e71dea1eebcb3e97738c46d099797102 Apr 22 19:30:24.955850 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:24.955789 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" event={"ID":"23ebb35c-b5d9-4de4-a35a-1c9597baee07","Type":"ContainerStarted","Data":"4646387c6db02dedbd5eed3263b3a834e71dea1eebcb3e97738c46d099797102"} Apr 22 19:30:32.976397 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:32.976363 2570 generic.go:358] "Generic (PLEG): container finished" podID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerID="67e485817939fcf1bb9c77941f70048f322ba41dd795b7f89ffe27e4caaf230b" exitCode=0 Apr 22 19:30:32.976822 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:32.976468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" event={"ID":"23ebb35c-b5d9-4de4-a35a-1c9597baee07","Type":"ContainerDied","Data":"67e485817939fcf1bb9c77941f70048f322ba41dd795b7f89ffe27e4caaf230b"} Apr 22 19:30:35.985371 ip-10-0-131-132 kubenswrapper[2570]: 
I0422 19:30:35.985335 2570 generic.go:358] "Generic (PLEG): container finished" podID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerID="8824627529c9b8e8e2f1850b11ee8c09f9b837e64f3a788733647bff5c2ced99" exitCode=0 Apr 22 19:30:35.985780 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:35.985414 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" event={"ID":"23ebb35c-b5d9-4de4-a35a-1c9597baee07","Type":"ContainerDied","Data":"8824627529c9b8e8e2f1850b11ee8c09f9b837e64f3a788733647bff5c2ced99"} Apr 22 19:30:43.007730 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:43.007698 2570 generic.go:358] "Generic (PLEG): container finished" podID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerID="d9a9f44980d23a4a24efbf94c792963c9d99252c5a4c961a13cc47459d02a499" exitCode=0 Apr 22 19:30:43.008139 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:43.007785 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" event={"ID":"23ebb35c-b5d9-4de4-a35a-1c9597baee07","Type":"ContainerDied","Data":"d9a9f44980d23a4a24efbf94c792963c9d99252c5a4c961a13cc47459d02a499"} Apr 22 19:30:44.126802 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.126777 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:44.254007 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.253964 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-bundle\") pod \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " Apr 22 19:30:44.254197 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.254022 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-util\") pod \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " Apr 22 19:30:44.254197 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.254052 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdn2k\" (UniqueName: \"kubernetes.io/projected/23ebb35c-b5d9-4de4-a35a-1c9597baee07-kube-api-access-qdn2k\") pod \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\" (UID: \"23ebb35c-b5d9-4de4-a35a-1c9597baee07\") " Apr 22 19:30:44.254561 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.254530 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-bundle" (OuterVolumeSpecName: "bundle") pod "23ebb35c-b5d9-4de4-a35a-1c9597baee07" (UID: "23ebb35c-b5d9-4de4-a35a-1c9597baee07"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:30:44.256270 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.256247 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ebb35c-b5d9-4de4-a35a-1c9597baee07-kube-api-access-qdn2k" (OuterVolumeSpecName: "kube-api-access-qdn2k") pod "23ebb35c-b5d9-4de4-a35a-1c9597baee07" (UID: "23ebb35c-b5d9-4de4-a35a-1c9597baee07"). InnerVolumeSpecName "kube-api-access-qdn2k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:30:44.259599 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.259551 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-util" (OuterVolumeSpecName: "util") pod "23ebb35c-b5d9-4de4-a35a-1c9597baee07" (UID: "23ebb35c-b5d9-4de4-a35a-1c9597baee07"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:30:44.355297 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.355212 2570 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-bundle\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:30:44.355297 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.355240 2570 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23ebb35c-b5d9-4de4-a35a-1c9597baee07-util\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:30:44.355297 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:44.355250 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdn2k\" (UniqueName: \"kubernetes.io/projected/23ebb35c-b5d9-4de4-a35a-1c9597baee07-kube-api-access-qdn2k\") on node \"ip-10-0-131-132.ec2.internal\" DevicePath \"\"" Apr 22 19:30:45.016189 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:45.016159 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" event={"ID":"23ebb35c-b5d9-4de4-a35a-1c9597baee07","Type":"ContainerDied","Data":"4646387c6db02dedbd5eed3263b3a834e71dea1eebcb3e97738c46d099797102"} Apr 22 19:30:45.016189 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:45.016190 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4646387c6db02dedbd5eed3263b3a834e71dea1eebcb3e97738c46d099797102" Apr 22 19:30:45.016404 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:45.016204 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjh2fl" Apr 22 19:30:51.395108 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395031 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c"] Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395266 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="util" Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395276 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="util" Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395287 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="pull" Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395293 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="pull" Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395301 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="extract" Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395307 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="extract" Apr 22 19:30:51.395485 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.395349 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="23ebb35c-b5d9-4de4-a35a-1c9597baee07" containerName="extract" Apr 22 19:30:51.398762 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.398734 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.402455 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.402427 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 19:30:51.402455 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.402447 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-scwzj\"" Apr 22 19:30:51.403025 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.403008 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 19:30:51.403498 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.403475 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 19:30:51.409419 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.409395 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c"] Apr 22 19:30:51.512105 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.512071 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/a0548002-b228-40f6-af7b-2fa4976cc949-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c\" (UID: \"a0548002-b228-40f6-af7b-2fa4976cc949\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.512272 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.512120 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xprl7\" (UniqueName: \"kubernetes.io/projected/a0548002-b228-40f6-af7b-2fa4976cc949-kube-api-access-xprl7\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c\" (UID: \"a0548002-b228-40f6-af7b-2fa4976cc949\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.613285 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.613254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a0548002-b228-40f6-af7b-2fa4976cc949-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c\" (UID: \"a0548002-b228-40f6-af7b-2fa4976cc949\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.613456 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.613302 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xprl7\" (UniqueName: \"kubernetes.io/projected/a0548002-b228-40f6-af7b-2fa4976cc949-kube-api-access-xprl7\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c\" (UID: \"a0548002-b228-40f6-af7b-2fa4976cc949\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.615588 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.615548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a0548002-b228-40f6-af7b-2fa4976cc949-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c\" (UID: 
\"a0548002-b228-40f6-af7b-2fa4976cc949\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.622607 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.622561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xprl7\" (UniqueName: \"kubernetes.io/projected/a0548002-b228-40f6-af7b-2fa4976cc949-kube-api-access-xprl7\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c\" (UID: \"a0548002-b228-40f6-af7b-2fa4976cc949\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.710805 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.710714 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:51.832330 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:51.832297 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c"] Apr 22 19:30:51.835364 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:30:51.835335 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0548002_b228_40f6_af7b_2fa4976cc949.slice/crio-d94bbd5d67540df2f62ec13bb6ae25f550103c766e44115bf4d566c0b55a96c7 WatchSource:0}: Error finding container d94bbd5d67540df2f62ec13bb6ae25f550103c766e44115bf4d566c0b55a96c7: Status 404 returned error can't find the container with id d94bbd5d67540df2f62ec13bb6ae25f550103c766e44115bf4d566c0b55a96c7 Apr 22 19:30:52.036749 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:52.036711 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" event={"ID":"a0548002-b228-40f6-af7b-2fa4976cc949","Type":"ContainerStarted","Data":"d94bbd5d67540df2f62ec13bb6ae25f550103c766e44115bf4d566c0b55a96c7"} Apr 22 19:30:55.774427 ip-10-0-131-132 kubenswrapper[2570]: 
I0422 19:30:55.774395 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ltjww"] Apr 22 19:30:55.777361 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.777342 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.779948 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.779925 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 19:30:55.780070 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.779972 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 19:30:55.780070 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.779985 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n4kw2\"" Apr 22 19:30:55.786309 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.786283 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ltjww"] Apr 22 19:30:55.845951 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.845916 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-cabundle0\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.846111 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.846003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65vx6\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-kube-api-access-65vx6\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " 
pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.846111 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.846028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.946644 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.946605 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65vx6\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-kube-api-access-65vx6\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.946644 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.946654 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.946877 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.946702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-cabundle0\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.946877 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:55.946823 2570 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:30:55.946877 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:55.946841 2570 projected.go:277] Couldn't get secret 
payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:30:55.946877 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:55.946851 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ltjww: references non-existent secret key: ca.crt Apr 22 19:30:55.947023 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:55.946911 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates podName:baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40 nodeName:}" failed. No retries permitted until 2026-04-22 19:30:56.446890602 +0000 UTC m=+445.497254078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates") pod "keda-operator-ffbb595cb-ltjww" (UID: "baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40") : references non-existent secret key: ca.crt Apr 22 19:30:55.947385 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.947367 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-cabundle0\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:55.961258 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:55.961227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65vx6\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-kube-api-access-65vx6\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww" Apr 22 19:30:56.049529 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.049435 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" event={"ID":"a0548002-b228-40f6-af7b-2fa4976cc949","Type":"ContainerStarted","Data":"d5454f4a5cc36ae56a29b86f842b21bdcc87162468d9c78afa09f41d18068c21"} Apr 22 19:30:56.049698 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.049552 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" Apr 22 19:30:56.070799 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.070746 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c" podStartSLOduration=1.629047565 podStartE2EDuration="5.070731109s" podCreationTimestamp="2026-04-22 19:30:51 +0000 UTC" firstStartedPulling="2026-04-22 19:30:51.837260395 +0000 UTC m=+440.887623877" lastFinishedPulling="2026-04-22 19:30:55.278943952 +0000 UTC m=+444.329307421" observedRunningTime="2026-04-22 19:30:56.068664581 +0000 UTC m=+445.119028071" watchObservedRunningTime="2026-04-22 19:30:56.070731109 +0000 UTC m=+445.121094598" Apr 22 19:30:56.206124 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.206090 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"] Apr 22 19:30:56.209137 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.209120 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.212966 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.212937 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 19:30:56.221057 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.221031 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"]
Apr 22 19:30:56.349594 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.349500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.349594 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.349540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.349764 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.349634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg8m8\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-kube-api-access-mg8m8\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.451014 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.450970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.451014 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.451014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.451046 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg8m8\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-kube-api-access-mg8m8\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.451083 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451142 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451166 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451178 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451186 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ltjww: references non-existent secret key: ca.crt
Apr 22 19:30:56.451229 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451232 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates podName:baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40 nodeName:}" failed. No retries permitted until 2026-04-22 19:30:57.451215955 +0000 UTC m=+446.501579423 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates") pod "keda-operator-ffbb595cb-ltjww" (UID: "baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40") : references non-existent secret key: ca.crt
Apr 22 19:30:56.451466 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451166 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 19:30:56.451466 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451271 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc: references non-existent secret key: tls.crt
Apr 22 19:30:56.451466 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.451323 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates podName:f84e9609-ae24-4810-b7a3-e514a6ef4ccd nodeName:}" failed. No retries permitted until 2026-04-22 19:30:56.95130641 +0000 UTC m=+446.001669879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates") pod "keda-metrics-apiserver-7c9f485588-87zlc" (UID: "f84e9609-ae24-4810-b7a3-e514a6ef4ccd") : references non-existent secret key: tls.crt
Apr 22 19:30:56.451466 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.451401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.459895 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.459870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg8m8\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-kube-api-access-mg8m8\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.538215 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.538179 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-k968b"]
Apr 22 19:30:56.541270 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.541250 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:56.543932 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.543911 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 19:30:56.554767 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.554747 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-k968b"]
Apr 22 19:30:56.653350 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.653254 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhf9\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-kube-api-access-zlhf9\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:56.653350 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.653298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-certificates\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:56.754588 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.754536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhf9\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-kube-api-access-zlhf9\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:56.754805 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.754606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-certificates\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:56.754805 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.754738 2570 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 22 19:30:56.754805 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.754761 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-k968b: secret "keda-admission-webhooks-certs" not found
Apr 22 19:30:56.754960 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.754822 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-certificates podName:65a518cc-3e70-4fca-ab95-247ced9d6199 nodeName:}" failed. No retries permitted until 2026-04-22 19:30:57.254802613 +0000 UTC m=+446.305166084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-certificates") pod "keda-admission-cf49989db-k968b" (UID: "65a518cc-3e70-4fca-ab95-247ced9d6199") : secret "keda-admission-webhooks-certs" not found
Apr 22 19:30:56.763674 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.763641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhf9\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-kube-api-access-zlhf9\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:56.957663 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:56.957290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:56.957663 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.957443 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 22 19:30:56.957663 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.957458 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 19:30:56.957663 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.957479 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc: references non-existent secret key: tls.crt
Apr 22 19:30:56.957663 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:56.957534 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates podName:f84e9609-ae24-4810-b7a3-e514a6ef4ccd nodeName:}" failed. No retries permitted until 2026-04-22 19:30:57.957515894 +0000 UTC m=+447.007879372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates") pod "keda-metrics-apiserver-7c9f485588-87zlc" (UID: "f84e9609-ae24-4810-b7a3-e514a6ef4ccd") : references non-existent secret key: tls.crt
Apr 22 19:30:57.260214 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:57.260175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-certificates\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:57.262697 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:57.262660 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65a518cc-3e70-4fca-ab95-247ced9d6199-certificates\") pod \"keda-admission-cf49989db-k968b\" (UID: \"65a518cc-3e70-4fca-ab95-247ced9d6199\") " pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:57.452047 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:57.452009 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:30:57.462092 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:57.462058 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:30:57.462227 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.462218 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 22 19:30:57.462283 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.462233 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 19:30:57.462283 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.462244 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ltjww: references non-existent secret key: ca.crt
Apr 22 19:30:57.462383 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.462296 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates podName:baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40 nodeName:}" failed. No retries permitted until 2026-04-22 19:30:59.462279458 +0000 UTC m=+448.512642940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates") pod "keda-operator-ffbb595cb-ltjww" (UID: "baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40") : references non-existent secret key: ca.crt
Apr 22 19:30:57.577672 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:57.577635 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-k968b"]
Apr 22 19:30:57.581152 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:30:57.581120 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a518cc_3e70_4fca_ab95_247ced9d6199.slice/crio-f5fe5570d1741d891f6738d0b8ace32dd71f69c617f930d184be11e128cc30f4 WatchSource:0}: Error finding container f5fe5570d1741d891f6738d0b8ace32dd71f69c617f930d184be11e128cc30f4: Status 404 returned error can't find the container with id f5fe5570d1741d891f6738d0b8ace32dd71f69c617f930d184be11e128cc30f4
Apr 22 19:30:57.966181 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:57.966087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:57.966652 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.966248 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 22 19:30:57.966652 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.966270 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 19:30:57.966652 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.966295 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc: references non-existent secret key: tls.crt
Apr 22 19:30:57.966652 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:30:57.966359 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates podName:f84e9609-ae24-4810-b7a3-e514a6ef4ccd nodeName:}" failed. No retries permitted until 2026-04-22 19:30:59.966339958 +0000 UTC m=+449.016703428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates") pod "keda-metrics-apiserver-7c9f485588-87zlc" (UID: "f84e9609-ae24-4810-b7a3-e514a6ef4ccd") : references non-existent secret key: tls.crt
Apr 22 19:30:58.056715 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:58.056681 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-k968b" event={"ID":"65a518cc-3e70-4fca-ab95-247ced9d6199","Type":"ContainerStarted","Data":"f5fe5570d1741d891f6738d0b8ace32dd71f69c617f930d184be11e128cc30f4"}
Apr 22 19:30:59.479673 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:59.479627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:30:59.482076 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:59.482054 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40-certificates\") pod \"keda-operator-ffbb595cb-ltjww\" (UID: \"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40\") " pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:30:59.687652 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:59.687567 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:30:59.808193 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:59.808158 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ltjww"]
Apr 22 19:30:59.810937 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:30:59.810910 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaab2cc0_8e9c_4606_ae1d_5e07e0a4ec40.slice/crio-aab8469ae68481be11f0f2b07580e397cd3b8079e5587688ce81bf81eb06263a WatchSource:0}: Error finding container aab8469ae68481be11f0f2b07580e397cd3b8079e5587688ce81bf81eb06263a: Status 404 returned error can't find the container with id aab8469ae68481be11f0f2b07580e397cd3b8079e5587688ce81bf81eb06263a
Apr 22 19:30:59.983401 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:59.983374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:30:59.985825 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:30:59.985764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f84e9609-ae24-4810-b7a3-e514a6ef4ccd-certificates\") pod \"keda-metrics-apiserver-7c9f485588-87zlc\" (UID: \"f84e9609-ae24-4810-b7a3-e514a6ef4ccd\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:31:00.063882 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:00.063839 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-k968b" event={"ID":"65a518cc-3e70-4fca-ab95-247ced9d6199","Type":"ContainerStarted","Data":"6d7dd143de238691ae8c6e88b8d76681f0ffa48db6f8d2de7407cb7b5feabaf0"}
Apr 22 19:31:00.064076 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:00.063965 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:31:00.064919 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:00.064897 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ltjww" event={"ID":"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40","Type":"ContainerStarted","Data":"aab8469ae68481be11f0f2b07580e397cd3b8079e5587688ce81bf81eb06263a"}
Apr 22 19:31:00.081652 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:00.081598 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-k968b" podStartSLOduration=2.521636 podStartE2EDuration="4.08156477s" podCreationTimestamp="2026-04-22 19:30:56 +0000 UTC" firstStartedPulling="2026-04-22 19:30:57.582689439 +0000 UTC m=+446.633052921" lastFinishedPulling="2026-04-22 19:30:59.142618223 +0000 UTC m=+448.192981691" observedRunningTime="2026-04-22 19:31:00.07961025 +0000 UTC m=+449.129973778" watchObservedRunningTime="2026-04-22 19:31:00.08156477 +0000 UTC m=+449.131928283"
Apr 22 19:31:00.122712 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:00.122679 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:31:00.244623 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:00.244236 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"]
Apr 22 19:31:00.246858 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:31:00.246828 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84e9609_ae24_4810_b7a3_e514a6ef4ccd.slice/crio-23927f9b80a74ce741e12612c5579dde2ef77f6835aa0d298dfeefb0765a3094 WatchSource:0}: Error finding container 23927f9b80a74ce741e12612c5579dde2ef77f6835aa0d298dfeefb0765a3094: Status 404 returned error can't find the container with id 23927f9b80a74ce741e12612c5579dde2ef77f6835aa0d298dfeefb0765a3094
Apr 22 19:31:01.073453 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:01.073403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc" event={"ID":"f84e9609-ae24-4810-b7a3-e514a6ef4ccd","Type":"ContainerStarted","Data":"23927f9b80a74ce741e12612c5579dde2ef77f6835aa0d298dfeefb0765a3094"}
Apr 22 19:31:04.083604 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:04.083469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc" event={"ID":"f84e9609-ae24-4810-b7a3-e514a6ef4ccd","Type":"ContainerStarted","Data":"f739a19b0a715752815bb50d1937f11dc708d0c84a99bcc2430c664422ef8b49"}
Apr 22 19:31:04.083604 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:04.083585 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:31:04.084957 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:04.084910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ltjww" event={"ID":"baab2cc0-8e9c-4606-ae1d-5e07e0a4ec40","Type":"ContainerStarted","Data":"9e857517a0d50c2365556faecc6357fa0f78930d2c85d0cbeba3d860de3de8a0"}
Apr 22 19:31:04.085098 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:04.085075 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:31:04.100213 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:04.100165 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc" podStartSLOduration=4.54309764 podStartE2EDuration="8.100150726s" podCreationTimestamp="2026-04-22 19:30:56 +0000 UTC" firstStartedPulling="2026-04-22 19:31:00.248174452 +0000 UTC m=+449.298537934" lastFinishedPulling="2026-04-22 19:31:03.805227538 +0000 UTC m=+452.855591020" observedRunningTime="2026-04-22 19:31:04.09946973 +0000 UTC m=+453.149833219" watchObservedRunningTime="2026-04-22 19:31:04.100150726 +0000 UTC m=+453.150514215"
Apr 22 19:31:04.118505 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:04.118443 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-ltjww" podStartSLOduration=5.119360476 podStartE2EDuration="9.118424255s" podCreationTimestamp="2026-04-22 19:30:55 +0000 UTC" firstStartedPulling="2026-04-22 19:30:59.812333409 +0000 UTC m=+448.862696877" lastFinishedPulling="2026-04-22 19:31:03.811397183 +0000 UTC m=+452.861760656" observedRunningTime="2026-04-22 19:31:04.11804906 +0000 UTC m=+453.168412551" watchObservedRunningTime="2026-04-22 19:31:04.118424255 +0000 UTC m=+453.168787746"
Apr 22 19:31:15.092539 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:15.092510 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-87zlc"
Apr 22 19:31:17.055514 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:17.055479 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvj7c"
Apr 22 19:31:21.075652 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:21.075617 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-k968b"
Apr 22 19:31:25.090214 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:31:25.090182 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-ltjww"
Apr 22 19:32:03.723786 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.723754 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"]
Apr 22 19:32:03.726861 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.726837 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:03.729423 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.729400 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 19:32:03.729557 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.729455 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 19:32:03.730688 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.730665 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-n54hp\""
Apr 22 19:32:03.730754 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.730672 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 19:32:03.736302 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.736277 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"]
Apr 22 19:32:03.741897 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.741875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3210e907-7781-4722-a0c7-86c7c08d668a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:03.742001 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.741907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-886s2\" (UniqueName: \"kubernetes.io/projected/3210e907-7781-4722-a0c7-86c7c08d668a-kube-api-access-886s2\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:03.842532 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.842499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3210e907-7781-4722-a0c7-86c7c08d668a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:03.842737 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.842540 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-886s2\" (UniqueName: \"kubernetes.io/projected/3210e907-7781-4722-a0c7-86c7c08d668a-kube-api-access-886s2\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:03.842737 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:32:03.842681 2570 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 19:32:03.842809 ip-10-0-131-132 kubenswrapper[2570]: E0422 19:32:03.842754 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3210e907-7781-4722-a0c7-86c7c08d668a-cert podName:3210e907-7781-4722-a0c7-86c7c08d668a nodeName:}" failed. No retries permitted until 2026-04-22 19:32:04.342738551 +0000 UTC m=+513.393102023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3210e907-7781-4722-a0c7-86c7c08d668a-cert") pod "llmisvc-controller-manager-68cc5db7c4-x65ph" (UID: "3210e907-7781-4722-a0c7-86c7c08d668a") : secret "llmisvc-webhook-server-cert" not found
Apr 22 19:32:03.851469 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:03.851441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-886s2\" (UniqueName: \"kubernetes.io/projected/3210e907-7781-4722-a0c7-86c7c08d668a-kube-api-access-886s2\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:04.345551 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:04.345512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3210e907-7781-4722-a0c7-86c7c08d668a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:04.347985 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:04.347952 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3210e907-7781-4722-a0c7-86c7c08d668a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x65ph\" (UID: \"3210e907-7781-4722-a0c7-86c7c08d668a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:04.637778 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:04.637674 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:04.759185 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:04.759143 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"]
Apr 22 19:32:04.762066 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:32:04.762030 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3210e907_7781_4722_a0c7_86c7c08d668a.slice/crio-15f9879914379908970f2adb8eb33c1d3d2303d0266e99c4c380a2c72ffcc87d WatchSource:0}: Error finding container 15f9879914379908970f2adb8eb33c1d3d2303d0266e99c4c380a2c72ffcc87d: Status 404 returned error can't find the container with id 15f9879914379908970f2adb8eb33c1d3d2303d0266e99c4c380a2c72ffcc87d
Apr 22 19:32:05.249551 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:05.249517 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph" event={"ID":"3210e907-7781-4722-a0c7-86c7c08d668a","Type":"ContainerStarted","Data":"15f9879914379908970f2adb8eb33c1d3d2303d0266e99c4c380a2c72ffcc87d"}
Apr 22 19:32:07.257181 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:07.257150 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph" event={"ID":"3210e907-7781-4722-a0c7-86c7c08d668a","Type":"ContainerStarted","Data":"f373c7f30bbec8dd1dc6825c387229c5a7b21f0b798b17f2de524ff0be181de0"}
Apr 22 19:32:07.257640 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:07.257252 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:32:07.274182 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:07.274124 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph" podStartSLOduration=2.440495436 podStartE2EDuration="4.274107322s" podCreationTimestamp="2026-04-22 19:32:03 +0000 UTC" firstStartedPulling="2026-04-22 19:32:04.763222346 +0000 UTC m=+513.813585817" lastFinishedPulling="2026-04-22 19:32:06.59683422 +0000 UTC m=+515.647197703" observedRunningTime="2026-04-22 19:32:07.272991086 +0000 UTC m=+516.323354573" watchObservedRunningTime="2026-04-22 19:32:07.274107322 +0000 UTC m=+516.324470815"
Apr 22 19:32:38.262335 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:32:38.262302 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x65ph"
Apr 22 19:33:13.368494 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.368451 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-qntmp"]
Apr 22 19:33:13.371582 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.371553 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.374387 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.374362 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 22 19:33:13.374525 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.374397 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-mmghp\""
Apr 22 19:33:13.384217 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.384190 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-qntmp"]
Apr 22 19:33:13.429891 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.429856 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89m4w\" (UniqueName: \"kubernetes.io/projected/8b297674-2f6d-487a-a011-5a60de622123-kube-api-access-89m4w\") pod \"odh-model-controller-696fc77849-qntmp\" (UID: \"8b297674-2f6d-487a-a011-5a60de622123\") " pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.430084 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.429918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b297674-2f6d-487a-a011-5a60de622123-cert\") pod \"odh-model-controller-696fc77849-qntmp\" (UID: \"8b297674-2f6d-487a-a011-5a60de622123\") " pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.530981 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.530942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b297674-2f6d-487a-a011-5a60de622123-cert\") pod \"odh-model-controller-696fc77849-qntmp\" (UID: \"8b297674-2f6d-487a-a011-5a60de622123\") " pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.531148 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.530994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89m4w\" (UniqueName: \"kubernetes.io/projected/8b297674-2f6d-487a-a011-5a60de622123-kube-api-access-89m4w\") pod \"odh-model-controller-696fc77849-qntmp\" (UID: \"8b297674-2f6d-487a-a011-5a60de622123\") " pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.533381 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.533358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b297674-2f6d-487a-a011-5a60de622123-cert\") pod \"odh-model-controller-696fc77849-qntmp\" (UID: \"8b297674-2f6d-487a-a011-5a60de622123\") " pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.542133 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.542103 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89m4w\" (UniqueName: \"kubernetes.io/projected/8b297674-2f6d-487a-a011-5a60de622123-kube-api-access-89m4w\") pod \"odh-model-controller-696fc77849-qntmp\" (UID: \"8b297674-2f6d-487a-a011-5a60de622123\") " pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.681913 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.681825 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22 19:33:13.802920 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:13.802886 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-qntmp"]
Apr 22 19:33:13.806335 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:33:13.806304 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b297674_2f6d_487a_a011_5a60de622123.slice/crio-dd459cc0eb4308f12b8e69e096cd6e3c57dfdd21c01ebef24ffd506fa70afb27 WatchSource:0}: Error finding container dd459cc0eb4308f12b8e69e096cd6e3c57dfdd21c01ebef24ffd506fa70afb27: Status 404 returned error can't find the container with id dd459cc0eb4308f12b8e69e096cd6e3c57dfdd21c01ebef24ffd506fa70afb27
Apr 22 19:33:14.445959 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:14.445922 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-qntmp" event={"ID":"8b297674-2f6d-487a-a011-5a60de622123","Type":"ContainerStarted","Data":"dd459cc0eb4308f12b8e69e096cd6e3c57dfdd21c01ebef24ffd506fa70afb27"}
Apr 22 19:33:17.456802 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:17.456763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-qntmp" event={"ID":"8b297674-2f6d-487a-a011-5a60de622123","Type":"ContainerStarted","Data":"8e48d31e968cdd2587bae7be199cd00d14de6c2876e34921269753001fe59c3f"}
Apr 22 19:33:17.457275 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:17.456882 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-qntmp"
Apr 22
19:33:17.474839 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:17.474787 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-qntmp" podStartSLOduration=1.83970346 podStartE2EDuration="4.474771471s" podCreationTimestamp="2026-04-22 19:33:13 +0000 UTC" firstStartedPulling="2026-04-22 19:33:13.807894313 +0000 UTC m=+582.858257798" lastFinishedPulling="2026-04-22 19:33:16.442962341 +0000 UTC m=+585.493325809" observedRunningTime="2026-04-22 19:33:17.473585488 +0000 UTC m=+586.523948975" watchObservedRunningTime="2026-04-22 19:33:17.474771471 +0000 UTC m=+586.525134962" Apr 22 19:33:28.461489 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:28.461452 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-qntmp" Apr 22 19:33:31.494586 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:31.494542 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:33:31.495053 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:33:31.495010 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:38:31.513779 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:38:31.513739 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:38:31.515164 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:38:31.515138 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:43:31.536364 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:43:31.536336 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:43:31.538084 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:43:31.538062 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:48:15.744686 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:15.744648 2570 ???:1] "http: TLS handshake error from 10.0.133.159:38008: EOF" Apr 22 19:48:15.750338 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:15.750308 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hvh5z_1d48cabd-e6ed-491b-9c22-028c42616dca/global-pull-secret-syncer/0.log" Apr 22 19:48:15.956053 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:15.956021 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gkcvh_59a9ae74-55c2-42a6-b0ec-5e69fea1ba3d/konnectivity-agent/0.log" Apr 22 19:48:16.017891 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:16.017797 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-132.ec2.internal_e33fdcedf15ae63b7d7592c581d6241e/haproxy/0.log" Apr 22 19:48:19.997181 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:19.997150 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-9zrx8_865bb00c-3a97-4aab-a7bd-cca3ef857f4f/monitoring-plugin/0.log" Apr 22 19:48:20.127118 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:20.127087 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n7wh4_b5f7f302-1d58-4544-8b02-0f35e261666a/node-exporter/0.log" Apr 22 19:48:20.152558 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:20.152528 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-n7wh4_b5f7f302-1d58-4544-8b02-0f35e261666a/kube-rbac-proxy/0.log" Apr 22 19:48:20.179117 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:20.179093 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n7wh4_b5f7f302-1d58-4544-8b02-0f35e261666a/init-textfile/0.log" Apr 22 19:48:20.612529 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:20.612499 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t2lpj_eba12bbc-2b28-461b-b407-96b6fb2de23e/prometheus-operator/0.log" Apr 22 19:48:20.641884 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:20.641851 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t2lpj_eba12bbc-2b28-461b-b407-96b6fb2de23e/kube-rbac-proxy/0.log" Apr 22 19:48:22.914323 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.914283 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7"] Apr 22 19:48:22.917379 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.917360 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:22.920023 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.919994 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-skx6f\"/\"kube-root-ca.crt\"" Apr 22 19:48:22.921090 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.921070 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-skx6f\"/\"default-dockercfg-6vwdj\"" Apr 22 19:48:22.921152 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.921093 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-skx6f\"/\"openshift-service-ca.crt\"" Apr 22 19:48:22.930001 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.929974 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7"] Apr 22 19:48:22.940908 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.940875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-proc\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:22.941085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.940923 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-podres\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:22.941085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.940947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-sys\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:22.941085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.941028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4v6\" (UniqueName: \"kubernetes.io/projected/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-kube-api-access-cz4v6\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:22.941085 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:22.941067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-lib-modules\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-podres\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041758 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041767 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-sys\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " 
pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041998 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4v6\" (UniqueName: \"kubernetes.io/projected/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-kube-api-access-cz4v6\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041998 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-lib-modules\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041998 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-proc\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041998 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-podres\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041998 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-sys\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.041998 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.041949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-proc\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.042202 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.042003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-lib-modules\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.051338 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.051305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4v6\" (UniqueName: \"kubernetes.io/projected/334d75e9-0ff4-449f-bef2-d51eaeb3d15b-kube-api-access-cz4v6\") pod \"perf-node-gather-daemonset-lh2g7\" (UID: \"334d75e9-0ff4-449f-bef2-d51eaeb3d15b\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.227761 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.227727 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:23.354174 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.354136 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7"] Apr 22 19:48:23.357680 ip-10-0-131-132 kubenswrapper[2570]: W0422 19:48:23.357645 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod334d75e9_0ff4_449f_bef2_d51eaeb3d15b.slice/crio-537cf20734ae5e32aadf3e6f143e270ae7674b605b14cad63cef57c2e7e13cf2 WatchSource:0}: Error finding container 537cf20734ae5e32aadf3e6f143e270ae7674b605b14cad63cef57c2e7e13cf2: Status 404 returned error can't find the container with id 537cf20734ae5e32aadf3e6f143e270ae7674b605b14cad63cef57c2e7e13cf2 Apr 22 19:48:23.359382 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.359361 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:48:23.988304 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.988262 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" event={"ID":"334d75e9-0ff4-449f-bef2-d51eaeb3d15b","Type":"ContainerStarted","Data":"b1163bc2750bdd6019e7304b9ee39f2c72d764a5652bc8948668c8d83a7dcd41"} Apr 22 19:48:23.988304 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.988303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" event={"ID":"334d75e9-0ff4-449f-bef2-d51eaeb3d15b","Type":"ContainerStarted","Data":"537cf20734ae5e32aadf3e6f143e270ae7674b605b14cad63cef57c2e7e13cf2"} Apr 22 19:48:23.988857 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:23.988390 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:24.007090 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:48:24.007043 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" podStartSLOduration=2.007028014 podStartE2EDuration="2.007028014s" podCreationTimestamp="2026-04-22 19:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:48:24.005817989 +0000 UTC m=+1493.056181513" watchObservedRunningTime="2026-04-22 19:48:24.007028014 +0000 UTC m=+1493.057391517" Apr 22 19:48:24.196990 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:24.196959 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-95lfv_638b7915-fc24-4304-b691-5e2dd5b5a7ce/dns/0.log" Apr 22 19:48:24.222804 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:24.222769 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-95lfv_638b7915-fc24-4304-b691-5e2dd5b5a7ce/kube-rbac-proxy/0.log" Apr 22 19:48:24.404182 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:24.404107 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8w5lr_9d618274-e61e-4ac5-b98d-0316d3addc15/dns-node-resolver/0.log" Apr 22 19:48:24.955905 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:24.955873 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6gbdc_3b3a8d77-2840-4166-a03a-e49d2f4f7de6/node-ca/0.log" Apr 22 19:48:26.111720 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:26.111690 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8vz9d_44c36a5b-c1fd-4922-93c7-7d7e2ee8797e/serve-healthcheck-canary/0.log" Apr 22 19:48:26.772856 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:26.772832 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-rq2j9_f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865/kube-rbac-proxy/0.log" Apr 22 19:48:26.800450 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:26.800423 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rq2j9_f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865/exporter/0.log" Apr 22 19:48:26.830615 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:26.830586 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rq2j9_f0ff4ce4-19f2-47d6-a74f-1ba8bf8b8865/extractor/0.log" Apr 22 19:48:28.966048 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:28.966016 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-x65ph_3210e907-7781-4722-a0c7-86c7c08d668a/manager/0.log" Apr 22 19:48:29.144438 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:29.144399 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-qntmp_8b297674-2f6d-487a-a011-5a60de622123/manager/0.log" Apr 22 19:48:30.000342 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:30.000315 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-lh2g7" Apr 22 19:48:31.557651 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:31.557620 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:48:31.558087 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:31.557751 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:48:35.481269 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.481239 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/kube-multus-additional-cni-plugins/0.log" Apr 22 19:48:35.510549 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.510519 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/egress-router-binary-copy/0.log" Apr 22 19:48:35.540964 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.540939 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/cni-plugins/0.log" Apr 22 19:48:35.570190 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.570160 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/bond-cni-plugin/0.log" Apr 22 19:48:35.598240 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.598209 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/routeoverride-cni/0.log" Apr 22 19:48:35.628687 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.628659 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/whereabouts-cni-bincopy/0.log" Apr 22 19:48:35.657735 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.657702 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kxfcc_55c68bde-7e46-4a89-a5ae-8a4047fde6e7/whereabouts-cni/0.log" Apr 22 19:48:35.731430 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.731337 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnx9f_9b64e2bc-78bb-4188-ae32-9f0e5f92f75a/kube-multus/0.log" Apr 22 19:48:35.792219 ip-10-0-131-132 
kubenswrapper[2570]: I0422 19:48:35.792187 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gwm2k_5df89727-eca2-4929-8ab4-9c1a7832889b/network-metrics-daemon/0.log" Apr 22 19:48:35.822089 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:35.822059 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gwm2k_5df89727-eca2-4929-8ab4-9c1a7832889b/kube-rbac-proxy/0.log" Apr 22 19:48:37.155179 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.155138 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-controller/0.log" Apr 22 19:48:37.184548 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.184517 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/0.log" Apr 22 19:48:37.198257 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.198202 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovn-acl-logging/1.log" Apr 22 19:48:37.223315 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.223283 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/kube-rbac-proxy-node/0.log" Apr 22 19:48:37.256389 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.256359 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:48:37.283770 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.283733 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/northd/0.log" Apr 22 
19:48:37.311893 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.311864 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/nbdb/0.log" Apr 22 19:48:37.347195 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.347167 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/sbdb/0.log" Apr 22 19:48:37.501034 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:37.500988 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cszr4_2bee9e68-7e05-489d-adaf-1e469041f7c1/ovnkube-controller/0.log" Apr 22 19:48:38.892713 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:38.892681 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mch7s_d5823fb1-e8f7-45fb-911a-f3cbcc56dfc8/network-check-target-container/0.log" Apr 22 19:48:39.897048 ip-10-0-131-132 kubenswrapper[2570]: I0422 19:48:39.897022 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mgsvn_0f0df841-c168-48f6-9e2e-f209a8216c52/iptables-alerter/0.log"