Apr 21 10:01:14.630297 ip-10-0-137-205 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 10:01:14.630309 ip-10-0-137-205 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 10:01:14.630316 ip-10-0-137-205 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 10:01:14.630517 ip-10-0-137-205 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 10:01:24.759023 ip-10-0-137-205 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 10:01:24.759045 ip-10-0-137-205 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b8e963a70ad04d1ca8dbd9aaebadd752 --
Apr 21 10:03:45.490147 ip-10-0-137-205 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:45.967748 ip-10-0-137-205 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:45.967748 ip-10-0-137-205 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:45.967748 ip-10-0-137-205 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:45.967748 ip-10-0-137-205 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:45.967748 ip-10-0-137-205 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:45.969314 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.969221 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 10:03:45.971668 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971648 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:45.971668 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971664 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:45.971668 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971669 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:45.971668 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971673 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971677 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971682 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971686 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971691 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971694 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971698 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971701 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971705 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971709 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971719 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971723 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971727 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971731 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971735 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971740 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971744 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971747 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971752 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971756 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:45.971928 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971759 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971764 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971768 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971772 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971776 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971779 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971783 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971788 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971792 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971799 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971805 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971810 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971814 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971819 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971824 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971829 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971835 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971839 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971844 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:45.972727 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971851 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971859 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971863 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971867 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971872 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971894 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971900 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971905 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971909 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971914 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971918 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971923 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971927 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971932 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971936 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971940 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971945 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971949 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971953 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971957 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:45.973448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971962 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971966 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971971 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971976 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971980 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971984 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971988 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971995 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.971999 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972004 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972008 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972015 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972019 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972023 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972027 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972031 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972036 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972040 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972044 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:45.974172 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972049 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972053 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972057 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972061 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972065 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972732 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972741 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972747 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972751 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972756 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972760 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972764 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972769 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972773 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972777 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972789 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972793 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972798 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972802 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972806 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:45.975018 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972811 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972816 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972820 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972825 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972828 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972832 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972836 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972841 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972845 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972849 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972852 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972856 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972861 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972865 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972869 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972873 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972894 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972899 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972903 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972907 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:45.975671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972911 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972914 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972919 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972923 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972927 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972931 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972935 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972939 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972951 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972958 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972963 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972970 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972977 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972983 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972988 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972992 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.972997 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973001 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973005 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:45.976214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973009 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973013 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973017 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973022 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973028 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973033 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973038 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973043 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973047 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973051 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973057 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973061 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973065 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973069 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973073 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973077 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973081 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973086 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973090 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:45.976737 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973094 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973098 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973102 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973111 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973115 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973120 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973124 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973128 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973133 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973137 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973141 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973145 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.973149 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973846 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973868 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973900 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973907 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973914 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973919 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973926 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973933 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 10:03:45.977454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973939 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973944 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973949 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973955 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973960 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973964 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973969 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973974 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973979 2575 flags.go:64] FLAG: --cloud-config=""
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973986 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.973991 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974000 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974005 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974011 2575 flags.go:64] FLAG: --config-dir=""
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974015 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974022 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974028 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974033 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974039 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974045 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974050 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974055 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974060 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974065 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974070 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 10:03:45.978010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974077 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974082 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974087 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974092 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974097 2575 flags.go:64] FLAG: --enable-server="true"
Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974101 2575 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974108 2575 flags.go:64] FLAG: --event-burst="100" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974113 2575 flags.go:64] FLAG: --event-qps="50" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974118 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974123 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974127 2575 flags.go:64] FLAG: --eviction-hard="" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974134 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974138 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974143 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974149 2575 flags.go:64] FLAG: --eviction-soft="" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974153 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974160 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974164 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974169 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974174 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 10:03:45.978720 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:03:45.974178 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974182 2575 flags.go:64] FLAG: --feature-gates="" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974188 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974194 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974200 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 10:03:45.978720 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974205 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974211 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974216 2575 flags.go:64] FLAG: --help="false" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974221 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-137-205.ec2.internal" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974226 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974231 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974235 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974241 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974247 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974252 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974256 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974261 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974266 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974271 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974276 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974280 2575 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974285 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974290 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974295 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974299 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974304 2575 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974309 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974313 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974318 2575 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 10:03:45.979362 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974327 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974332 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974337 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974342 2575 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974347 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974352 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974357 2575 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974364 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974372 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974377 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974383 2575 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974389 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974394 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974398 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 
10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974403 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974408 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974413 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974418 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974430 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974435 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974440 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974444 2575 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974449 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:45.979964 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974459 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974463 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974468 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974474 2575 flags.go:64] FLAG: --port="10250" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974478 2575 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974483 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bc88427d43905c29" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974488 2575 flags.go:64] FLAG: --qos-reserved="" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974492 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974498 2575 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974503 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974508 2575 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974513 2575 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974518 2575 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974523 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974528 2575 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974534 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974539 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974545 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974550 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974555 2575 
flags.go:64] FLAG: --runonce="false" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974560 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974566 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974571 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974575 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974580 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974585 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:45.980509 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974590 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974595 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974600 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974604 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974609 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974614 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974619 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974632 2575 flags.go:64] FLAG: --system-cgroups="" Apr 21 
10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974638 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974646 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974651 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974658 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974666 2575 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974670 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974675 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974680 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974685 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974690 2575 flags.go:64] FLAG: --v="2" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974697 2575 flags.go:64] FLAG: --version="false" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974704 2575 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974711 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.974717 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974896 2575 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974903 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:45.981197 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974911 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974918 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974922 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974926 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974930 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974935 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974939 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974943 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974948 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974952 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974956 2575 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974960 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974964 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974968 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974972 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974976 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974981 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974985 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974989 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:45.982112 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.974996 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975000 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975005 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975009 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:45.982966 ip-10-0-137-205 
kubenswrapper[2575]: W0421 10:03:45.975017 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975022 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975026 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975030 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975034 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975038 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975042 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975048 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975054 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975060 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975071 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975076 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975080 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975084 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975089 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975093 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:45.982966 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975097 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975101 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975106 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975110 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975114 2575 feature_gate.go:328] unrecognized feature 
gate: VSphereHostVMGroupZonal Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975118 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975122 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975127 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975131 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975135 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975139 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975143 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975150 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975154 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975158 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975164 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975169 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975173 
2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975177 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975181 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:45.983753 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975186 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975190 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975194 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975198 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975203 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975207 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975211 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975220 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975224 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975229 2575 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975233 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975237 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975241 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975245 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975250 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975254 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975258 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975262 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975266 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975270 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:45.984424 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975274 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975278 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975282 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975286 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.975292 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.976180 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.984220 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.984396 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984471 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984478 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984483 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984487 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984491 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984496 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984500 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984504 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:45.984950 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984508 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984512 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984516 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984521 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984525 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984529 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984533 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984538 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984542 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984546 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984551 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984555 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984559 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984565 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984569 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984572 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984575 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984578 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984580 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984583 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:45.985391 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984586 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984589 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984591 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984599 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984602 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984605 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984608 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984611 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984613 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984616 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984618 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984621 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984624 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984626 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984629 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984631 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984634 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984636 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984639 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984641 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:45.985952 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984644 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984647 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984650 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984652 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984655 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984657 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984660 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984663 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984666 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984668 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984671 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984673 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984677 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984681 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984684 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984688 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984691 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984694 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984696 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:45.986448 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984699 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984702 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984706 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984711 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984714 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984717 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984720 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984723 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984726 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984728 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984731 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984734 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984737 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984740 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984742 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984745 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984748 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984750 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:45.986945 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984753 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.984758 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984852 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984857 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984860 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984863 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984866 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984870 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984874 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984896 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984900 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984903 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984906 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984909 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984912 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:45.987383 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984915 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984918 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984920 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984923 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984926 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984929 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984931 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984935 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984938 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984944 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984946 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984949 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984952 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984954 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984957 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984960 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984962 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984965 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984968 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:45.987748 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984970 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984973 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984976 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984978 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984981 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984983 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984986 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984990 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984992 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984995 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.984997 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985000 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985002 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985005 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985008 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985010 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985012 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985015 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985018 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:45.988232 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985020 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985023 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985026 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985028 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985031 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985034 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985036 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985039 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985041 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985044 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985046 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985048 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985051 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985054 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985056 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985059 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985062 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985065 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985067 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985070 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:45.988699 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985072 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985075 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985078 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985080 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985083 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985086 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985088 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985091 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985093 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985096 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985098 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985101 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985103 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985106 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:45.985108 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.985114 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:45.989194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.985234 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 10:03:45.989590 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.989065 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 10:03:45.990167 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.990154 2575 server.go:1019] "Starting client certificate rotation"
Apr 21 10:03:45.990266 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.990252 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:45.990302 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:45.990292 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:46.018916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.018897 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:46.023443 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.023423 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:46.039789 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.039768 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 21 10:03:46.046031 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.046014 2575 log.go:25] "Validated CRI v1 image API"
Apr 21 10:03:46.047313 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.047299 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 10:03:46.047640 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.047625 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:46.055592 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.055559 2575 fs.go:135] Filesystem UUIDs: map[3576ab75-931b-4cdb-8af1-a25b84ff52f1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 eaa16d4c-01e0-4caa-9d86-fc8129eed038:/dev/nvme0n1p3]
Apr 21 10:03:46.055686 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.055593 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 10:03:46.061607 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.061496 2575 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:46.059662017 +0000 UTC m=+0.444500935 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099384 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22f38c56a674500c8e472c8bc6faba SystemUUID:ec22f38c-56a6-7450-0c8e-472c8bc6faba BootID:b8e963a7-0ad0-4d1c-a8db-d9aaebadd752 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8e:8a:59:46:9f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8e:8a:59:46:9f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:17:76:39:e2:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 10:03:46.061607 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.061597 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 10:03:46.061738 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.061715 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 10:03:46.062936 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.062909 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 10:03:46.063094 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.062937 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-205.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 10:03:46.063169 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.063109 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 10:03:46.063169 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.063122 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 10:03:46.063169 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.063141
2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:46.063910 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.063897 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:46.064142 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.064125 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5d967" Apr 21 10:03:46.065099 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.065085 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:46.065226 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.065215 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 10:03:46.068734 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.068724 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 21 10:03:46.068804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.068742 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 10:03:46.068804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.068758 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 10:03:46.068804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.068771 2575 kubelet.go:397] "Adding apiserver pod source" Apr 21 10:03:46.068804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.068783 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 10:03:46.070220 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.070206 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:46.070285 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.070230 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:46.071077 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:03:46.071061 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5d967" Apr 21 10:03:46.073658 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.073643 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 10:03:46.075096 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.075080 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 10:03:46.076826 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076814 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076831 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076837 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076843 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076848 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076854 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076860 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 10:03:46.076897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076866 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 10:03:46.076897 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:03:46.076894 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 10:03:46.077103 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076905 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 10:03:46.077103 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076918 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 10:03:46.077103 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.076931 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 10:03:46.078536 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.078523 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 10:03:46.078575 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.078538 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 10:03:46.081916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.081900 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:46.081983 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.081971 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 10:03:46.082024 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.082006 2575 server.go:1295] "Started kubelet" Apr 21 10:03:46.082110 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.082087 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 10:03:46.082170 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.082089 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 10:03:46.082170 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.082151 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 10:03:46.082837 ip-10-0-137-205 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 10:03:46.088118 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.088096 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 10:03:46.088795 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.088772 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:46.089068 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.089049 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 21 10:03:46.091020 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.090996 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-205.ec2.internal" not found Apr 21 10:03:46.095069 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.095046 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:46.095626 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.095608 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 10:03:46.096255 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096233 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 10:03:46.096255 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096255 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 10:03:46.096385 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096232 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 10:03:46.096385 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096313 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 21 10:03:46.096385 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096322 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 21 10:03:46.096611 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096507 2575 factory.go:55] Registering systemd factory Apr 21 10:03:46.096611 ip-10-0-137-205 kubenswrapper[2575]: 
I0421 10:03:46.096522 2575 factory.go:223] Registration of the systemd container factory successfully Apr 21 10:03:46.096773 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096753 2575 factory.go:153] Registering CRI-O factory Apr 21 10:03:46.096846 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096775 2575 factory.go:223] Registration of the crio container factory successfully Apr 21 10:03:46.096846 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096823 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 10:03:46.096846 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096845 2575 factory.go:103] Registering Raw factory Apr 21 10:03:46.097012 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.096858 2575 manager.go:1196] Started watching for new ooms in manager Apr 21 10:03:46.097059 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.097030 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-205.ec2.internal\" not found" Apr 21 10:03:46.097267 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.097253 2575 manager.go:319] Starting recovery of all containers Apr 21 10:03:46.098767 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.098743 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:46.099003 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.098970 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 10:03:46.102241 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.102220 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-205.ec2.internal\" not found" node="ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.106314 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.106292 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-205.ec2.internal" not found Apr 21 10:03:46.107637 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.107620 2575 manager.go:324] Recovery completed Apr 21 10:03:46.112099 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.112086 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:46.114035 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.114008 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:46.114119 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.114045 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:46.114119 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.114055 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:46.114508 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.114494 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 10:03:46.114557 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.114507 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 10:03:46.114557 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.114524 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:46.117113 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:03:46.117102 2575 policy_none.go:49] "None policy: Start" Apr 21 10:03:46.117149 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.117118 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 10:03:46.117149 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.117127 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155427 2575 manager.go:341] "Starting Device Plugin manager" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.155453 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155462 2575 server.go:85] "Starting device plugin registration server" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155698 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155710 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155792 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155858 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.155866 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.156658 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 10:03:46.159435 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.156694 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-205.ec2.internal\" not found" Apr 21 10:03:46.166866 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.166841 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-205.ec2.internal" not found Apr 21 10:03:46.216613 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.216583 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 10:03:46.217841 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.217780 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 10:03:46.217841 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.217807 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 10:03:46.217841 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.217834 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 10:03:46.217841 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.217841 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 10:03:46.218067 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:46.217938 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 10:03:46.220351 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.220332 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:46.256192 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.256149 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:46.257781 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.257764 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:46.257854 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.257795 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:46.257854 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.257806 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:46.257854 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.257828 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.266189 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.266166 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.318804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.318770 2575 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal"] Apr 21 10:03:46.321795 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.321766 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.321924 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.321776 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.347578 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.347556 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.351904 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.351871 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.369722 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.369703 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:46.369722 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.369716 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:46.397907 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.397870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/788d8cac45308933e675cf575b113182-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal\" (UID: \"788d8cac45308933e675cf575b113182\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.398004 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.397912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/788d8cac45308933e675cf575b113182-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal\" (UID: \"788d8cac45308933e675cf575b113182\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.398004 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.397937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2aa63aeeee1f49e020d8254357661d3b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-205.ec2.internal\" (UID: \"2aa63aeeee1f49e020d8254357661d3b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.498295 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.498229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/788d8cac45308933e675cf575b113182-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal\" (UID: \"788d8cac45308933e675cf575b113182\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.498295 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.498258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2aa63aeeee1f49e020d8254357661d3b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-205.ec2.internal\" (UID: \"2aa63aeeee1f49e020d8254357661d3b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.498295 ip-10-0-137-205 kubenswrapper[2575]: I0421 
10:03:46.498276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/788d8cac45308933e675cf575b113182-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal\" (UID: \"788d8cac45308933e675cf575b113182\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.498462 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.498311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2aa63aeeee1f49e020d8254357661d3b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-205.ec2.internal\" (UID: \"2aa63aeeee1f49e020d8254357661d3b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.498462 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.498338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/788d8cac45308933e675cf575b113182-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal\" (UID: \"788d8cac45308933e675cf575b113182\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.498462 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.498365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/788d8cac45308933e675cf575b113182-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal\" (UID: \"788d8cac45308933e675cf575b113182\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.673050 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.673014 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.673208 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.673089 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" Apr 21 10:03:46.990114 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.990050 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 10:03:46.990783 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.990192 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 10:03:46.990783 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.990230 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 10:03:46.990783 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:46.990257 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 10:03:47.069207 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.069178 2575 apiserver.go:52] "Watching apiserver" Apr 21 10:03:47.073195 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.073162 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:46 +0000 
UTC" deadline="2027-12-27 16:19:33.030496006 +0000 UTC" Apr 21 10:03:47.073195 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.073192 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14766h15m45.957306601s" Apr 21 10:03:47.076990 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.076974 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 10:03:47.077317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.077292 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xmljw","openshift-multus/multus-lbjqk","openshift-multus/network-metrics-daemon-z7xfz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t","openshift-image-registry/node-ca-7p7gp","openshift-network-diagnostics/network-check-target-4kjcv","openshift-network-operator/iptables-alerter-xdnxh","openshift-ovn-kubernetes/ovnkube-node-w88kn","kube-system/konnectivity-agent-qlmqr","kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal","openshift-cluster-node-tuning-operator/tuned-zpdx2","openshift-dns/node-resolver-9nzkd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal"] Apr 21 10:03:47.079058 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.079043 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.080091 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.080072 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.081170 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.081152 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz"
Apr 21 10:03:47.081258 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.081220 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d"
Apr 21 10:03:47.081793 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.081760 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.081873 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.081836 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 10:03:47.081954 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.081919 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.082020 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.082003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-57929\""
Apr 21 10:03:47.082068 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.082025 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 10:03:47.082108 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.082081 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 10:03:47.083122 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.082825 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 10:03:47.083122 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.082850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.083122 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.082835 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-whmjl\""
Apr 21 10:03:47.085151 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.084988 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7p7gp"
Apr 21 10:03:47.085833 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.085814 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.085948 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.085845 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 10:03:47.085948 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.085929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qh762\""
Apr 21 10:03:47.086053 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.085989 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.087055 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.087039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv"
Apr 21 10:03:47.087129 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.087095 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad"
Apr 21 10:03:47.088296 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.088278 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xdnxh"
Apr 21 10:03:47.089681 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.089664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.090138 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090118 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.090215 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090168 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 10:03:47.090270 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090127 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vj7ql\""
Apr 21 10:03:47.090396 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090382 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.090645 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.090756 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090680 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8b4v4\""
Apr 21 10:03:47.090869 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090852 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 10:03:47.090995 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.090968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qlmqr"
Apr 21 10:03:47.091435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.091419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.092797 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.092781 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.093353 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093334 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 10:03:47.093474 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093370 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 10:03:47.093474 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093401 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 10:03:47.093474 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093424 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 10:03:47.093474 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093465 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.093714 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093480 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5wfdc\""
Apr 21 10:03:47.093714 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093424 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.093714 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093508 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9xwmk\""
Apr 21 10:03:47.093714 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093597 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 10:03:47.093860 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.093840 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 10:03:47.094119 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.094108 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9nzkd"
Apr 21 10:03:47.095141 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.095126 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.095216 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.095146 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:47.095216 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.095207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ghbqc\""
Apr 21 10:03:47.095541 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.095519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.096329 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.096314 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 10:03:47.096466 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.096288 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 10:03:47.096640 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.096624 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8q6xb\""
Apr 21 10:03:47.097139 ip-10-0-137-205
kubenswrapper[2575]: I0421 10:03:47.097121 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 10:03:47.100343 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-kubelet\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.100416 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.100416 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-os-release\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.100480 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.100480 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-hostroot\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.100480 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv"
Apr 21 10:03:47.100569 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxck8\" (UniqueName: \"kubernetes.io/projected/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-kube-api-access-nxck8\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh"
Apr 21 10:03:47.100569 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-etc-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.100569 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100548 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-ovn\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.100651 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-netns\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.100651 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-etc-kubernetes\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.100651 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdd9\" (UniqueName: \"kubernetes.io/projected/1100ece6-afda-453b-8595-490c94dbb90d-kube-api-access-dxdd9\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz"
Apr 21 10:03:47.100651 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-ovnkube-config\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.100794 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-ovnkube-script-lib\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.100794 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.100675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-sys\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.101116 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ds7\" (UniqueName: \"kubernetes.io/projected/cabbe82c-cbeb-4577-9474-2dce5128f826-kube-api-access-v8ds7\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.101165 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eac76dbb-f531-4a5b-a588-7457fe7db6c4-cni-binary-copy\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.101165 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lt8x\" (UniqueName: \"kubernetes.io/projected/bf91954e-2bd6-4597-b660-140925e88c87-kube-api-access-2lt8x\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp"
Apr 21 10:03:47.101228 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cnibin\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.101228 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cni-binary-copy\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.101228 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvjd4\" (UniqueName: \"kubernetes.io/projected/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-kube-api-access-bvjd4\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.101325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-iptables-alerter-script\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh"
Apr 21 10:03:47.101325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-node-log\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.101325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-system-cni-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.101325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-socket-dir-parent\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.101325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.101325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-modprobe-d\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-cni-bin\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/629199d2-db97-402a-9f00-3541bd582211-agent-certs\") pod \"konnectivity-agent-qlmqr\" (UID: \"629199d2-db97-402a-9f00-3541bd582211\") " pod="kube-system/konnectivity-agent-qlmqr"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-systemd\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-k8s-cni-cncf-io\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-device-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-system-cni-dir\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101541 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-systemd\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-cni-netd\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.101570 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-kubernetes\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-run\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-sys-fs\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-lib-modules\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cabbe82c-cbeb-4577-9474-2dce5128f826-tmp\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-socket-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf91954e-2bd6-4597-b660-140925e88c87-host\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-run-netns\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkd4p\" (UniqueName: \"kubernetes.io/projected/78e28615-5373-42a5-a30d-cd814a0943b4-kube-api-access-bkd4p\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysconfig\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-cni-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-registration-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-var-lib-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/629199d2-db97-402a-9f00-3541bd582211-konnectivity-ca\") pod \"konnectivity-agent-qlmqr\" (UID: \"629199d2-db97-402a-9f00-3541bd582211\") " pod="kube-system/konnectivity-agent-qlmqr"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.101983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-var-lib-kubelet\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf91954e-2bd6-4597-b660-140925e88c87-serviceca\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-slash\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-cni-multus\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-conf-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-multus-certs\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8df\" (UniqueName: \"kubernetes.io/projected/284c2d4a-a157-4bc4-a0db-be9046acd561-kube-api-access-pb8df\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqx8\" (UniqueName: \"kubernetes.io/projected/15e1aa47-50b2-4765-a870-9c646ae4fb01-kube-api-access-zbqx8\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-systemd-units\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78e28615-5373-42a5-a30d-cd814a0943b4-ovn-node-metrics-cert\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-host-slash\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-kubelet\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-cni-bin\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-tuned\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-os-release\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysctl-d\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysctl-conf\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn"
Apr 21 10:03:47.102897 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102504 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-env-overrides\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102553 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-daemon-config\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbsp\" (UniqueName: \"kubernetes.io/projected/eac76dbb-f531-4a5b-a588-7457fe7db6c4-kube-api-access-8kbsp\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-etc-selinux\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 
10:03:47.102648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/15e1aa47-50b2-4765-a870-9c646ae4fb01-hosts-file\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15e1aa47-50b2-4765-a870-9c646ae4fb01-tmp-dir\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-log-socket\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-host\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 
21 10:03:47.103337 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.102781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-cnibin\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.105977 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.105959 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 10:03:47.127560 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.127539 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-b6g4h" Apr 21 10:03:47.135357 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.135339 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-b6g4h" Apr 21 10:03:47.137938 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.137901 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa63aeeee1f49e020d8254357661d3b.slice/crio-aa93a0b178f6950c7b3083aff7dd4c2f9ac4408f152e79036fcd7f481221da52 WatchSource:0}: Error finding container aa93a0b178f6950c7b3083aff7dd4c2f9ac4408f152e79036fcd7f481221da52: Status 404 returned error can't find the container with id aa93a0b178f6950c7b3083aff7dd4c2f9ac4408f152e79036fcd7f481221da52 Apr 21 10:03:47.138190 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.138174 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788d8cac45308933e675cf575b113182.slice/crio-7bd38befb45b9c7b784b9839b87f3c1be333a8ca268e0109b1f496f6827cc4e7 WatchSource:0}: Error finding container 
7bd38befb45b9c7b784b9839b87f3c1be333a8ca268e0109b1f496f6827cc4e7: Status 404 returned error can't find the container with id 7bd38befb45b9c7b784b9839b87f3c1be333a8ca268e0109b1f496f6827cc4e7 Apr 21 10:03:47.141949 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.141931 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:03:47.202987 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.202966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf91954e-2bd6-4597-b660-140925e88c87-serviceca\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.203095 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.202993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-slash\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203095 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-cni-multus\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203095 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-conf-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203095 ip-10-0-137-205 kubenswrapper[2575]: 
I0421 10:03:47.203068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-cni-multus\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203095 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-conf-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-multus-certs\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-multus-certs\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203175 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8df\" (UniqueName: \"kubernetes.io/projected/284c2d4a-a157-4bc4-a0db-be9046acd561-kube-api-access-pb8df\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqx8\" (UniqueName: \"kubernetes.io/projected/15e1aa47-50b2-4765-a870-9c646ae4fb01-kube-api-access-zbqx8\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-systemd-units\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-slash\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203291 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.203231 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.203301 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs 
podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:47.703273641 +0000 UTC m=+2.088112550 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf91954e-2bd6-4597-b660-140925e88c87-serviceca\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-systemd-units\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78e28615-5373-42a5-a30d-cd814a0943b4-ovn-node-metrics-cert\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-host-slash\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") 
" pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-kubelet\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-cni-bin\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-tuned\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.203564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-os-release\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysctl-d\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " 
pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-cni-bin\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysctl-conf\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-os-release\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-kubelet\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-host-slash\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-env-overrides\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203702 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysctl-conf\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-daemon-config\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbsp\" (UniqueName: \"kubernetes.io/projected/eac76dbb-f531-4a5b-a588-7457fe7db6c4-kube-api-access-8kbsp\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-etc-selinux\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysctl-d\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/15e1aa47-50b2-4765-a870-9c646ae4fb01-hosts-file\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.203994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15e1aa47-50b2-4765-a870-9c646ae4fb01-tmp-dir\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-etc-selinux\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-log-socket\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-log-socket\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-host\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-cnibin\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.203989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-kubelet\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-cnibin\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-os-release\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-kubelet\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-host\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-hostroot\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.204690 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-os-release\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxck8\" (UniqueName: \"kubernetes.io/projected/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-kube-api-access-nxck8\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-etc-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/15e1aa47-50b2-4765-a870-9c646ae4fb01-hosts-file\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-env-overrides\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204277 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-hostroot\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15e1aa47-50b2-4765-a870-9c646ae4fb01-tmp-dir\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-daemon-config\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-ovn\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-netns\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-etc-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204366 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-ovn\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-etc-kubernetes\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdd9\" (UniqueName: \"kubernetes.io/projected/1100ece6-afda-453b-8595-490c94dbb90d-kube-api-access-dxdd9\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-etc-kubernetes\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-netns\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.205412 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-ovnkube-config\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-ovnkube-script-lib\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-sys\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ds7\" (UniqueName: 
\"kubernetes.io/projected/cabbe82c-cbeb-4577-9474-2dce5128f826-kube-api-access-v8ds7\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eac76dbb-f531-4a5b-a588-7457fe7db6c4-cni-binary-copy\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lt8x\" (UniqueName: \"kubernetes.io/projected/bf91954e-2bd6-4597-b660-140925e88c87-kube-api-access-2lt8x\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.204998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cnibin\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205036 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cni-binary-copy\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvjd4\" (UniqueName: \"kubernetes.io/projected/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-kube-api-access-bvjd4\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-iptables-alerter-script\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-node-log\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-ovnkube-config\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 
10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-system-cni-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-socket-dir-parent\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-modprobe-d\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.206194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-cni-bin\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " 
pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-var-lib-cni-bin\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.205850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cnibin\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78e28615-5373-42a5-a30d-cd814a0943b4-ovnkube-script-lib\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-modprobe-d\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-node-log\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cni-binary-copy\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-sys\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206495 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-system-cni-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-socket-dir-parent\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eac76dbb-f531-4a5b-a588-7457fe7db6c4-cni-binary-copy\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 
10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/629199d2-db97-402a-9f00-3541bd582211-agent-certs\") pod \"konnectivity-agent-qlmqr\" (UID: \"629199d2-db97-402a-9f00-3541bd582211\") " pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-systemd\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-k8s-cni-cncf-io\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.206960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-device-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.206994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-system-cni-dir\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-systemd\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-cni-netd\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-device-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-kubernetes\") pod 
\"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-host-run-k8s-cni-cncf-io\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-iptables-alerter-script\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-kubernetes\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-run\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-systemd\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-sys-fs\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-run-systemd\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-sys-fs\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-run\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-cni-netd\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.207596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-system-cni-dir\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-lib-modules\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-lib-modules\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/cabbe82c-cbeb-4577-9474-2dce5128f826-tmp\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-socket-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf91954e-2bd6-4597-b660-140925e88c87-host\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-run-netns\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78e28615-5373-42a5-a30d-cd814a0943b4-ovn-node-metrics-cert\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkd4p\" 
(UniqueName: \"kubernetes.io/projected/78e28615-5373-42a5-a30d-cd814a0943b4-kube-api-access-bkd4p\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-tuned\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-host-run-netns\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf91954e-2bd6-4597-b660-140925e88c87-host\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.207902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysconfig\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-cni-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-registration-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-socket-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.208317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-var-lib-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208318 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/629199d2-db97-402a-9f00-3541bd582211-konnectivity-ca\") pod \"konnectivity-agent-qlmqr\" (UID: \"629199d2-db97-402a-9f00-3541bd582211\") " pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eac76dbb-f531-4a5b-a588-7457fe7db6c4-multus-cni-dir\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-var-lib-kubelet\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-var-lib-kubelet\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/284c2d4a-a157-4bc4-a0db-be9046acd561-registration-dir\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208558 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78e28615-5373-42a5-a30d-cd814a0943b4-var-lib-openvswitch\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.209032 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.208659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cabbe82c-cbeb-4577-9474-2dce5128f826-etc-sysconfig\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.209327 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.209174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/629199d2-db97-402a-9f00-3541bd582211-konnectivity-ca\") pod \"konnectivity-agent-qlmqr\" (UID: \"629199d2-db97-402a-9f00-3541bd582211\") " pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:03:47.210536 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.210514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cabbe82c-cbeb-4577-9474-2dce5128f826-tmp\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.210649 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.210517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/629199d2-db97-402a-9f00-3541bd582211-agent-certs\") pod \"konnectivity-agent-qlmqr\" (UID: \"629199d2-db97-402a-9f00-3541bd582211\") " pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:03:47.216298 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.216270 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pb8df\" (UniqueName: \"kubernetes.io/projected/284c2d4a-a157-4bc4-a0db-be9046acd561-kube-api-access-pb8df\") pod \"aws-ebs-csi-driver-node-cks8t\" (UID: \"284c2d4a-a157-4bc4-a0db-be9046acd561\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.217534 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.217506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqx8\" (UniqueName: \"kubernetes.io/projected/15e1aa47-50b2-4765-a870-9c646ae4fb01-kube-api-access-zbqx8\") pod \"node-resolver-9nzkd\" (UID: \"15e1aa47-50b2-4765-a870-9c646ae4fb01\") " pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.218859 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.218841 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:47.218975 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.218862 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:47.218975 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.218890 2575 projected.go:194] Error preparing data for projected volume kube-api-access-pd98l for pod openshift-network-diagnostics/network-check-target-4kjcv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:47.218975 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.218963 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l podName:983ab4c3-e7a5-4914-b247-d139ed1699ad nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:47.718944827 +0000 UTC m=+2.103783754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pd98l" (UniqueName: "kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l") pod "network-check-target-4kjcv" (UID: "983ab4c3-e7a5-4914-b247-d139ed1699ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:47.220044 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.220006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxck8\" (UniqueName: \"kubernetes.io/projected/99b900e4-2c7f-4af0-bb0e-3e9eef7571a5-kube-api-access-nxck8\") pod \"iptables-alerter-xdnxh\" (UID: \"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5\") " pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.220737 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.220715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdd9\" (UniqueName: \"kubernetes.io/projected/1100ece6-afda-453b-8595-490c94dbb90d-kube-api-access-dxdd9\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:47.221100 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.221053 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" event={"ID":"788d8cac45308933e675cf575b113182","Type":"ContainerStarted","Data":"7bd38befb45b9c7b784b9839b87f3c1be333a8ca268e0109b1f496f6827cc4e7"} Apr 21 10:03:47.221715 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.221683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjd4\" (UniqueName: 
\"kubernetes.io/projected/d4d5b62e-5966-4d1f-b4ca-61a8acf47843-kube-api-access-bvjd4\") pod \"multus-additional-cni-plugins-xmljw\" (UID: \"d4d5b62e-5966-4d1f-b4ca-61a8acf47843\") " pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.222092 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.222075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ds7\" (UniqueName: \"kubernetes.io/projected/cabbe82c-cbeb-4577-9474-2dce5128f826-kube-api-access-v8ds7\") pod \"tuned-zpdx2\" (UID: \"cabbe82c-cbeb-4577-9474-2dce5128f826\") " pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.222179 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.222158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lt8x\" (UniqueName: \"kubernetes.io/projected/bf91954e-2bd6-4597-b660-140925e88c87-kube-api-access-2lt8x\") pod \"node-ca-7p7gp\" (UID: \"bf91954e-2bd6-4597-b660-140925e88c87\") " pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.222234 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.222216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" event={"ID":"2aa63aeeee1f49e020d8254357661d3b","Type":"ContainerStarted","Data":"aa93a0b178f6950c7b3083aff7dd4c2f9ac4408f152e79036fcd7f481221da52"} Apr 21 10:03:47.223064 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.223045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkd4p\" (UniqueName: \"kubernetes.io/projected/78e28615-5373-42a5-a30d-cd814a0943b4-kube-api-access-bkd4p\") pod \"ovnkube-node-w88kn\" (UID: \"78e28615-5373-42a5-a30d-cd814a0943b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.223131 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.223069 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbsp\" 
(UniqueName: \"kubernetes.io/projected/eac76dbb-f531-4a5b-a588-7457fe7db6c4-kube-api-access-8kbsp\") pod \"multus-lbjqk\" (UID: \"eac76dbb-f531-4a5b-a588-7457fe7db6c4\") " pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.410520 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.410398 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xmljw" Apr 21 10:03:47.416723 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.416699 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d5b62e_5966_4d1f_b4ca_61a8acf47843.slice/crio-365b953b3f54f4a5be14b9d060ede14c06cc6fe6c3aa29fb754381c3e960ca8e WatchSource:0}: Error finding container 365b953b3f54f4a5be14b9d060ede14c06cc6fe6c3aa29fb754381c3e960ca8e: Status 404 returned error can't find the container with id 365b953b3f54f4a5be14b9d060ede14c06cc6fe6c3aa29fb754381c3e960ca8e Apr 21 10:03:47.417338 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.417321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lbjqk" Apr 21 10:03:47.423726 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.423706 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeac76dbb_f531_4a5b_a588_7457fe7db6c4.slice/crio-32dacbe75b520f92c13769588d3e935df2054f8754b26e5f82ebc2c77e48f8e5 WatchSource:0}: Error finding container 32dacbe75b520f92c13769588d3e935df2054f8754b26e5f82ebc2c77e48f8e5: Status 404 returned error can't find the container with id 32dacbe75b520f92c13769588d3e935df2054f8754b26e5f82ebc2c77e48f8e5 Apr 21 10:03:47.444894 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.444849 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" Apr 21 10:03:47.448498 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.448463 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7p7gp" Apr 21 10:03:47.451916 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.451890 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod284c2d4a_a157_4bc4_a0db_be9046acd561.slice/crio-4be34cd565d4c3884bb28f7e0e54cc2ac1f4ad1f7a346a7954f29c09120b2f03 WatchSource:0}: Error finding container 4be34cd565d4c3884bb28f7e0e54cc2ac1f4ad1f7a346a7954f29c09120b2f03: Status 404 returned error can't find the container with id 4be34cd565d4c3884bb28f7e0e54cc2ac1f4ad1f7a346a7954f29c09120b2f03 Apr 21 10:03:47.455476 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.455456 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xdnxh" Apr 21 10:03:47.456110 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.456091 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf91954e_2bd6_4597_b660_140925e88c87.slice/crio-e2c3619c395879cc86d381d22c3a6695777ae2061b135145f3451d1c35404dba WatchSource:0}: Error finding container e2c3619c395879cc86d381d22c3a6695777ae2061b135145f3451d1c35404dba: Status 404 returned error can't find the container with id e2c3619c395879cc86d381d22c3a6695777ae2061b135145f3451d1c35404dba Apr 21 10:03:47.460524 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.460503 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:03:47.465554 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.465531 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:03:47.465816 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.465795 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b900e4_2c7f_4af0_bb0e_3e9eef7571a5.slice/crio-729fef68a595f6410414d39160a9716e4cc548599fb46d6fd081b889b3e39420 WatchSource:0}: Error finding container 729fef68a595f6410414d39160a9716e4cc548599fb46d6fd081b889b3e39420: Status 404 returned error can't find the container with id 729fef68a595f6410414d39160a9716e4cc548599fb46d6fd081b889b3e39420 Apr 21 10:03:47.471311 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.470424 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" Apr 21 10:03:47.471311 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.470846 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78e28615_5373_42a5_a30d_cd814a0943b4.slice/crio-2a1b6fd704341a03c3be44571bec5ecd6f0c495bfda92be1ec6bcf3e279a66f2 WatchSource:0}: Error finding container 2a1b6fd704341a03c3be44571bec5ecd6f0c495bfda92be1ec6bcf3e279a66f2: Status 404 returned error can't find the container with id 2a1b6fd704341a03c3be44571bec5ecd6f0c495bfda92be1ec6bcf3e279a66f2 Apr 21 10:03:47.474236 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.474216 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod629199d2_db97_402a_9f00_3541bd582211.slice/crio-6447b09264a53eecd8ae35892489a6969b270b7c84d1402262ccce42d8628a74 WatchSource:0}: Error finding container 6447b09264a53eecd8ae35892489a6969b270b7c84d1402262ccce42d8628a74: Status 404 returned error can't find the container with id 6447b09264a53eecd8ae35892489a6969b270b7c84d1402262ccce42d8628a74 Apr 21 10:03:47.475015 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:03:47.474944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9nzkd" Apr 21 10:03:47.481158 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.481137 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcabbe82c_cbeb_4577_9474_2dce5128f826.slice/crio-4dccc76fe6034272a209506cdcf11f4d5c98985e6969e7f04d556d7266c97e49 WatchSource:0}: Error finding container 4dccc76fe6034272a209506cdcf11f4d5c98985e6969e7f04d556d7266c97e49: Status 404 returned error can't find the container with id 4dccc76fe6034272a209506cdcf11f4d5c98985e6969e7f04d556d7266c97e49 Apr 21 10:03:47.484624 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:03:47.484576 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e1aa47_50b2_4765_a870_9c646ae4fb01.slice/crio-cb4c03a690df4c2c281ec91846ddacb8e0f1d661504db96ae42ff277278e4e97 WatchSource:0}: Error finding container cb4c03a690df4c2c281ec91846ddacb8e0f1d661504db96ae42ff277278e4e97: Status 404 returned error can't find the container with id cb4c03a690df4c2c281ec91846ddacb8e0f1d661504db96ae42ff277278e4e97 Apr 21 10:03:47.712339 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.712213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:47.712499 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.712379 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:47.712499 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.712450 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:48.712431214 +0000 UTC m=+3.097270136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:47.813271 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.812554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:47.813271 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.812700 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:47.813271 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.812718 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:47.813271 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:47.812731 2575 projected.go:194] Error preparing data for projected volume kube-api-access-pd98l for pod openshift-network-diagnostics/network-check-target-4kjcv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:47.813271 ip-10-0-137-205 
kubenswrapper[2575]: E0421 10:03:47.812789 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l podName:983ab4c3-e7a5-4914-b247-d139ed1699ad nodeName:}" failed. No retries permitted until 2026-04-21 10:03:48.812768967 +0000 UTC m=+3.197607894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pd98l" (UniqueName: "kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l") pod "network-check-target-4kjcv" (UID: "983ab4c3-e7a5-4914-b247-d139ed1699ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:47.861715 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.861416 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:47.985553 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:47.985474 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:48.136937 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.136832 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:47 +0000 UTC" deadline="2027-09-30 05:21:01.632875543 +0000 UTC" Apr 21 10:03:48.136937 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.136864 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12643h17m13.496015894s" Apr 21 10:03:48.242925 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.242797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qlmqr" 
event={"ID":"629199d2-db97-402a-9f00-3541bd582211","Type":"ContainerStarted","Data":"6447b09264a53eecd8ae35892489a6969b270b7c84d1402262ccce42d8628a74"} Apr 21 10:03:48.254682 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.254566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerStarted","Data":"365b953b3f54f4a5be14b9d060ede14c06cc6fe6c3aa29fb754381c3e960ca8e"} Apr 21 10:03:48.259853 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.259755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9nzkd" event={"ID":"15e1aa47-50b2-4765-a870-9c646ae4fb01","Type":"ContainerStarted","Data":"cb4c03a690df4c2c281ec91846ddacb8e0f1d661504db96ae42ff277278e4e97"} Apr 21 10:03:48.273299 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.273238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" event={"ID":"cabbe82c-cbeb-4577-9474-2dce5128f826","Type":"ContainerStarted","Data":"4dccc76fe6034272a209506cdcf11f4d5c98985e6969e7f04d556d7266c97e49"} Apr 21 10:03:48.280518 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.280327 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:48.285177 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.285033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"2a1b6fd704341a03c3be44571bec5ecd6f0c495bfda92be1ec6bcf3e279a66f2"} Apr 21 10:03:48.288083 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.287989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xdnxh" 
event={"ID":"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5","Type":"ContainerStarted","Data":"729fef68a595f6410414d39160a9716e4cc548599fb46d6fd081b889b3e39420"} Apr 21 10:03:48.291999 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.291970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7p7gp" event={"ID":"bf91954e-2bd6-4597-b660-140925e88c87","Type":"ContainerStarted","Data":"e2c3619c395879cc86d381d22c3a6695777ae2061b135145f3451d1c35404dba"} Apr 21 10:03:48.305729 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.305678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" event={"ID":"284c2d4a-a157-4bc4-a0db-be9046acd561","Type":"ContainerStarted","Data":"4be34cd565d4c3884bb28f7e0e54cc2ac1f4ad1f7a346a7954f29c09120b2f03"} Apr 21 10:03:48.322906 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.322845 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbjqk" event={"ID":"eac76dbb-f531-4a5b-a588-7457fe7db6c4","Type":"ContainerStarted","Data":"32dacbe75b520f92c13769588d3e935df2054f8754b26e5f82ebc2c77e48f8e5"} Apr 21 10:03:48.722080 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.721995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:48.722259 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:48.722165 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:48.722259 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:48.722226 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:50.722207165 +0000 UTC m=+5.107046073 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:48.822801 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:48.822719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:48.822996 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:48.822874 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:48.822996 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:48.822914 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:48.822996 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:48.822931 2575 projected.go:194] Error preparing data for projected volume kube-api-access-pd98l for pod openshift-network-diagnostics/network-check-target-4kjcv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:48.822996 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:48.822993 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l podName:983ab4c3-e7a5-4914-b247-d139ed1699ad nodeName:}" failed. No retries permitted until 2026-04-21 10:03:50.822972918 +0000 UTC m=+5.207811842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pd98l" (UniqueName: "kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l") pod "network-check-target-4kjcv" (UID: "983ab4c3-e7a5-4914-b247-d139ed1699ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:49.137994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:49.137847 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:47 +0000 UTC" deadline="2027-10-12 04:15:21.202815206 +0000 UTC" Apr 21 10:03:49.137994 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:49.137901 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12930h11m32.064919114s" Apr 21 10:03:49.219306 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:49.218542 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:49.219306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:49.218674 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:03:49.219306 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:49.219127 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:49.219306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:49.219231 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:03:50.737827 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:50.737749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:50.738291 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:50.737891 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:50.738291 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:50.737981 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.73795325 +0000 UTC m=+9.122792162 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:50.838845 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:50.838808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:50.839033 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:50.838972 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:50.839033 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:50.838990 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:50.839033 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:50.839000 2575 projected.go:194] Error preparing data for projected volume kube-api-access-pd98l for pod openshift-network-diagnostics/network-check-target-4kjcv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:50.839193 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:50.839038 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l podName:983ab4c3-e7a5-4914-b247-d139ed1699ad nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:54.839025415 +0000 UTC m=+9.223864319 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pd98l" (UniqueName: "kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l") pod "network-check-target-4kjcv" (UID: "983ab4c3-e7a5-4914-b247-d139ed1699ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:51.218301 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:51.218227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:51.218461 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:51.218358 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:03:51.218461 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:51.218384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:51.218579 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:51.218479 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:03:52.555472 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.554599 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m5ktd"] Apr 21 10:03:52.558640 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.558607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.558766 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:52.558699 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:03:52.652748 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.652713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-dbus\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.652748 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.652755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-kubelet-config\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.653037 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.652807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.753564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.753525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.753741 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.753612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-dbus\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.753741 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.753644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-kubelet-config\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.753741 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:52.753680 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:52.753923 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.753746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-kubelet-config\") pod 
\"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:52.753923 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:52.753755 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret podName:43b65a7c-65a4-4a35-8d96-82a1e1a9288d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:53.253735442 +0000 UTC m=+7.638574364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret") pod "global-pull-secret-syncer-m5ktd" (UID: "43b65a7c-65a4-4a35-8d96-82a1e1a9288d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:52.753923 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:52.753809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-dbus\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:53.218999 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:53.218966 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:53.219192 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:53.219020 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:53.219192 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:53.219105 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:03:53.219306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:53.219259 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:03:53.258282 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:53.258246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:53.258444 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:53.258381 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:53.258444 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:53.258442 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret podName:43b65a7c-65a4-4a35-8d96-82a1e1a9288d nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:54.258421304 +0000 UTC m=+8.643260212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret") pod "global-pull-secret-syncer-m5ktd" (UID: "43b65a7c-65a4-4a35-8d96-82a1e1a9288d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:54.219089 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:54.219042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:54.219530 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.219176 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:03:54.266002 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:54.265943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:54.266176 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.266084 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:54.266176 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.266145 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret podName:43b65a7c-65a4-4a35-8d96-82a1e1a9288d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:56.266125499 +0000 UTC m=+10.650964425 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret") pod "global-pull-secret-syncer-m5ktd" (UID: "43b65a7c-65a4-4a35-8d96-82a1e1a9288d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:54.770403 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:54.770362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:54.770600 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.770486 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.770600 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.770570 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:02.770549417 +0000 UTC m=+17.155388325 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.872157 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:54.872078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:54.872339 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.872304 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:54.872339 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.872332 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:54.872449 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.872346 2575 projected.go:194] Error preparing data for projected volume kube-api-access-pd98l for pod openshift-network-diagnostics/network-check-target-4kjcv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.872449 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:54.872432 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l podName:983ab4c3-e7a5-4914-b247-d139ed1699ad nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:02.872412646 +0000 UTC m=+17.257251563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pd98l" (UniqueName: "kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l") pod "network-check-target-4kjcv" (UID: "983ab4c3-e7a5-4914-b247-d139ed1699ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:55.218569 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:55.218477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:55.218741 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:55.218596 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:03:55.219033 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:55.219013 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:55.219145 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:55.219121 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:03:56.218713 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:56.218678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:56.218900 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:56.218781 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:03:56.283953 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:56.283917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:56.284353 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:56.284056 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:56.284353 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:56.284113 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret podName:43b65a7c-65a4-4a35-8d96-82a1e1a9288d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:00.284096681 +0000 UTC m=+14.668935606 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret") pod "global-pull-secret-syncer-m5ktd" (UID: "43b65a7c-65a4-4a35-8d96-82a1e1a9288d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:57.218825 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:57.218786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:57.219024 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:57.218786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:57.219024 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:57.218918 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:03:57.219024 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:57.218996 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:03:58.218039 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:58.218005 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:03:58.218458 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:58.218134 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:03:59.218517 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:59.218486 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:03:59.218947 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:03:59.218486 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:03:59.218947 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:59.218624 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:03:59.218947 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:03:59.218700 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:00.218391 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:00.218358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:00.218532 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:00.218510 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:00.315959 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:00.315924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:00.316110 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:00.316085 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:00.316163 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:00.316155 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret podName:43b65a7c-65a4-4a35-8d96-82a1e1a9288d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:08.316135903 +0000 UTC m=+22.700974809 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret") pod "global-pull-secret-syncer-m5ktd" (UID: "43b65a7c-65a4-4a35-8d96-82a1e1a9288d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:01.218228 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:01.218191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:01.218414 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:01.218191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:01.218414 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:01.218296 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:01.218414 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:01.218373 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:02.218680 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:02.218640 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:02.219123 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.218757 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:02.835047 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:02.835015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:02.835315 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.835156 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:02.835315 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.835219 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.835199667 +0000 UTC m=+33.220038573 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:02.936117 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:02.936079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:02.936279 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.936207 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:02.936279 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.936224 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:02.936279 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.936234 2575 projected.go:194] Error preparing data for projected volume kube-api-access-pd98l for pod openshift-network-diagnostics/network-check-target-4kjcv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:02.936428 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:02.936288 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l podName:983ab4c3-e7a5-4914-b247-d139ed1699ad nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:18.936269736 +0000 UTC m=+33.321108646 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pd98l" (UniqueName: "kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l") pod "network-check-target-4kjcv" (UID: "983ab4c3-e7a5-4914-b247-d139ed1699ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:03.218912 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:03.218864 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:03.219332 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:03.218874 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:03.219332 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:03.218999 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:03.219332 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:03.219128 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:04.218284 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:04.218251 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:04.218469 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:04.218356 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:05.218172 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.218023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:05.218422 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.218023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:05.218422 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:05.218228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:05.218422 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:05.218288 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:05.358535 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.358470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" event={"ID":"cabbe82c-cbeb-4577-9474-2dce5128f826","Type":"ContainerStarted","Data":"2d6c5a8f4ac077a6e20a5a8f9c363baa90fa92497e2965ba5595f73dfec8adda"} Apr 21 10:04:05.364151 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.363918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"eb31db38ae09d7e20b93e0b9408a6d442811e6f1cea0b14ac116a59bc0b16e70"} Apr 21 10:04:05.366290 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.366256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbjqk" event={"ID":"eac76dbb-f531-4a5b-a588-7457fe7db6c4","Type":"ContainerStarted","Data":"fb9fe13b8cdf7c8183dfbc5ef7bed8df69ea453267152af9b42043ce433b20d5"} Apr 21 10:04:05.368245 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.368108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" event={"ID":"2aa63aeeee1f49e020d8254357661d3b","Type":"ContainerStarted","Data":"2308be2e55fceb0b44ed8c2145839682e7dacb7421fb19fad15e9ed786f31b9f"} Apr 21 10:04:05.387012 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:04:05.386971 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zpdx2" podStartSLOduration=1.940425963 podStartE2EDuration="19.386960546s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.483064076 +0000 UTC m=+1.867902983" lastFinishedPulling="2026-04-21 10:04:04.929598651 +0000 UTC m=+19.314437566" observedRunningTime="2026-04-21 10:04:05.386458766 +0000 UTC m=+19.771297688" watchObservedRunningTime="2026-04-21 10:04:05.386960546 +0000 UTC m=+19.771799472" Apr 21 10:04:05.414285 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.414249 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-205.ec2.internal" podStartSLOduration=19.414236709 podStartE2EDuration="19.414236709s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:04:05.398165031 +0000 UTC m=+19.783003958" watchObservedRunningTime="2026-04-21 10:04:05.414236709 +0000 UTC m=+19.799075635" Apr 21 10:04:05.414658 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:05.414550 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lbjqk" podStartSLOduration=1.868997039 podStartE2EDuration="19.414540938s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.425563832 +0000 UTC m=+1.810402740" lastFinishedPulling="2026-04-21 10:04:04.971107722 +0000 UTC m=+19.355946639" observedRunningTime="2026-04-21 10:04:05.413833067 +0000 UTC m=+19.798671993" watchObservedRunningTime="2026-04-21 10:04:05.414540938 +0000 UTC m=+19.799379866" Apr 21 10:04:06.219102 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.219072 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:06.219709 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:06.219183 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:06.371600 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.371562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7p7gp" event={"ID":"bf91954e-2bd6-4597-b660-140925e88c87","Type":"ContainerStarted","Data":"0ae9dc3b625df2152053f795dbe4e51e056fae00cd76fc593fc2b790ee7d26e6"} Apr 21 10:04:06.373001 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.372962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" event={"ID":"284c2d4a-a157-4bc4-a0db-be9046acd561","Type":"ContainerStarted","Data":"491e7ae4a7ed09036ea7ab049bea1ca12574e4ffd4a85d3c9304fa0a0b41dc3f"} Apr 21 10:04:06.374385 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.374357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qlmqr" event={"ID":"629199d2-db97-402a-9f00-3541bd582211","Type":"ContainerStarted","Data":"a95a87085c53f68cde452aad52974e5b45f7ebb230121edfbc62f04f5b6cad3e"} Apr 21 10:04:06.375684 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.375659 2575 generic.go:358] "Generic (PLEG): container finished" podID="d4d5b62e-5966-4d1f-b4ca-61a8acf47843" containerID="23502da050bbde27da909f81887141308ccfa591c69bf45ed24079579cdbee18" exitCode=0 Apr 21 10:04:06.375799 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.375728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerDied","Data":"23502da050bbde27da909f81887141308ccfa591c69bf45ed24079579cdbee18"} Apr 21 10:04:06.377137 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.377111 2575 generic.go:358] "Generic (PLEG): container finished" podID="788d8cac45308933e675cf575b113182" containerID="fb280423af2e85c8442ea856b96820907d570d8d3a31d3ca027951d826749733" exitCode=0 Apr 21 10:04:06.377137 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.377135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" event={"ID":"788d8cac45308933e675cf575b113182","Type":"ContainerDied","Data":"fb280423af2e85c8442ea856b96820907d570d8d3a31d3ca027951d826749733"} Apr 21 10:04:06.379157 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.378846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9nzkd" event={"ID":"15e1aa47-50b2-4765-a870-9c646ae4fb01","Type":"ContainerStarted","Data":"29c3b8cbcf7534386f4a044766bfa010976e648daba34f3a3ee0e08a2d9311ff"} Apr 21 10:04:06.381684 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.381664 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:04:06.382106 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.382056 2575 generic.go:358] "Generic (PLEG): container finished" podID="78e28615-5373-42a5-a30d-cd814a0943b4" containerID="74063df57d667419fec4b06bc2e2df6bd8168aa45aebf46950646130ba0a89a1" exitCode=1 Apr 21 10:04:06.382106 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.382091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" 
event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerDied","Data":"74063df57d667419fec4b06bc2e2df6bd8168aa45aebf46950646130ba0a89a1"} Apr 21 10:04:06.382106 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.382114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"07135ad09a19c465948eb3f9d12ccf3362c3dac7d56485b89d41f833b0853e72"} Apr 21 10:04:06.382106 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.382125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"731ff220554f2f333cf9b8be58ab94317432cc210398826eee8346aa3bc16d92"} Apr 21 10:04:06.382106 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.382134 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"932c65db8970fc4b53317c036bc1777798a3c3416dd6d83ae99f314ee6ea079f"} Apr 21 10:04:06.382106 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.382142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"a22054f67e4266fa821d680c0ddc9c8fb94810bc3e6d64d1f84dfe7d5789f741"} Apr 21 10:04:06.383663 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.383637 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xdnxh" event={"ID":"99b900e4-2c7f-4af0-bb0e-3e9eef7571a5","Type":"ContainerStarted","Data":"d405f492dcbdb234d3c7ff2e1770ac0b0f11ec9c473610ef7f74c3458dec404b"} Apr 21 10:04:06.404807 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.404763 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/node-ca-7p7gp" podStartSLOduration=2.936333417 podStartE2EDuration="20.404748412s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.46110356 +0000 UTC m=+1.845942464" lastFinishedPulling="2026-04-21 10:04:04.929518549 +0000 UTC m=+19.314357459" observedRunningTime="2026-04-21 10:04:06.389461413 +0000 UTC m=+20.774300340" watchObservedRunningTime="2026-04-21 10:04:06.404748412 +0000 UTC m=+20.789587383" Apr 21 10:04:06.448627 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.448565 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xdnxh" podStartSLOduration=2.946989918 podStartE2EDuration="20.448550666s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.469649774 +0000 UTC m=+1.854488692" lastFinishedPulling="2026-04-21 10:04:04.971210532 +0000 UTC m=+19.356049440" observedRunningTime="2026-04-21 10:04:06.448423853 +0000 UTC m=+20.833262779" watchObservedRunningTime="2026-04-21 10:04:06.448550666 +0000 UTC m=+20.833389593" Apr 21 10:04:06.479232 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.479047 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qlmqr" podStartSLOduration=3.026609266 podStartE2EDuration="20.47903401s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.477095903 +0000 UTC m=+1.861934809" lastFinishedPulling="2026-04-21 10:04:04.929520639 +0000 UTC m=+19.314359553" observedRunningTime="2026-04-21 10:04:06.463419705 +0000 UTC m=+20.848258630" watchObservedRunningTime="2026-04-21 10:04:06.47903401 +0000 UTC m=+20.863872935" Apr 21 10:04:06.479389 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.479363 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9nzkd" podStartSLOduration=3.000825905 
podStartE2EDuration="20.479353795s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.486335854 +0000 UTC m=+1.871174758" lastFinishedPulling="2026-04-21 10:04:04.964863731 +0000 UTC m=+19.349702648" observedRunningTime="2026-04-21 10:04:06.478704662 +0000 UTC m=+20.863543588" watchObservedRunningTime="2026-04-21 10:04:06.479353795 +0000 UTC m=+20.864192720" Apr 21 10:04:06.785998 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:06.785960 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 10:04:07.167006 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.166864 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:06.785981169Z","UUID":"5c690a6c-a1ec-4ddd-8b06-bcd564f1d053","Handler":null,"Name":"","Endpoint":""} Apr 21 10:04:07.168769 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.168744 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 10:04:07.168926 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.168778 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 10:04:07.218797 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.218686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:07.218797 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.218730 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:07.219017 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:07.218815 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:07.219017 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:07.218905 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:07.388432 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.388387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" event={"ID":"788d8cac45308933e675cf575b113182","Type":"ContainerStarted","Data":"0cab10c3d3f30192e46c1e032627b85319fcded31bb09f0b8614fad2960e6864"} Apr 21 10:04:07.390254 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.390226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" event={"ID":"284c2d4a-a157-4bc4-a0db-be9046acd561","Type":"ContainerStarted","Data":"eacbebdf4aa03e9a7f7c834398ce679cd07fe050d6168b27ac259f48883006b5"} Apr 21 10:04:07.403139 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:07.403090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-205.ec2.internal" podStartSLOduration=21.403073471 podStartE2EDuration="21.403073471s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:04:07.402282246 +0000 UTC m=+21.787121172" watchObservedRunningTime="2026-04-21 10:04:07.403073471 +0000 UTC m=+21.787912398" Apr 21 10:04:08.219107 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:08.219072 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:08.219268 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:08.219195 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:08.384305 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:08.384267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:08.384503 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:08.384414 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:08.384503 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:08.384475 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret podName:43b65a7c-65a4-4a35-8d96-82a1e1a9288d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:24.384459936 +0000 UTC m=+38.769298855 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret") pod "global-pull-secret-syncer-m5ktd" (UID: "43b65a7c-65a4-4a35-8d96-82a1e1a9288d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:08.395326 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:08.395300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:04:08.395840 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:08.395689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"f0665b4760a15b162cd2433459467b9822c937a81b0d6eb78141822302787d36"} Apr 21 10:04:08.397426 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:08.397397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" event={"ID":"284c2d4a-a157-4bc4-a0db-be9046acd561","Type":"ContainerStarted","Data":"6cc4aa160be64c84b797e4bcbda882e7bb7cab494daf4e81aafca4a6e22e5fdb"} Apr 21 10:04:08.419142 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:08.419099 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cks8t" podStartSLOduration=2.358501986 podStartE2EDuration="22.41908561s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.454098419 +0000 UTC m=+1.838937329" lastFinishedPulling="2026-04-21 10:04:07.514682041 +0000 UTC m=+21.899520953" observedRunningTime="2026-04-21 10:04:08.418414583 +0000 UTC m=+22.803253510" watchObservedRunningTime="2026-04-21 10:04:08.41908561 +0000 UTC m=+22.803924537" Apr 21 10:04:09.218207 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:09.218165 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:09.218394 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:09.218282 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:09.218394 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:09.218311 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:09.218394 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:09.218384 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:09.723225 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:09.723186 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:04:09.724154 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:09.724135 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:04:10.218822 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:10.218697 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:10.218955 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:10.218909 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:10.404707 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:10.404681 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:04:10.405155 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:10.405121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"3337ea971fd552d2c27022acda3aa1f67ce2a901f8b8e5cb02616f64a3008a92"} Apr 21 10:04:10.405430 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:10.405407 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:04:10.405630 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:10.405614 2575 scope.go:117] "RemoveContainer" containerID="74063df57d667419fec4b06bc2e2df6bd8168aa45aebf46950646130ba0a89a1" Apr 21 10:04:10.405904 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:10.405870 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qlmqr" Apr 21 10:04:11.218965 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.218936 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:11.219361 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.218976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:11.219361 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:11.219029 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:11.219361 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:11.219156 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:11.410586 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.410557 2575 generic.go:358] "Generic (PLEG): container finished" podID="d4d5b62e-5966-4d1f-b4ca-61a8acf47843" containerID="87fad505ea98cc49f0cd9359711fcab3034a4d443622a27ade4ed6350d458b86" exitCode=0 Apr 21 10:04:11.410781 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.410656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerDied","Data":"87fad505ea98cc49f0cd9359711fcab3034a4d443622a27ade4ed6350d458b86"} Apr 21 10:04:11.414799 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.414776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:04:11.415143 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.415118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" event={"ID":"78e28615-5373-42a5-a30d-cd814a0943b4","Type":"ContainerStarted","Data":"01a2c2c05a975cbba0c5631bf8e6d463130ef5f4227a42d667786657e957a011"} Apr 21 10:04:11.415263 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.415248 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 10:04:11.415552 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.415531 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:04:11.415638 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.415564 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:04:11.433072 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.433047 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:04:11.433435 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.433419 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:04:11.459242 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:11.459141 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" podStartSLOduration=7.946738165 podStartE2EDuration="25.459124557s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.472781194 +0000 UTC m=+1.857620098" lastFinishedPulling="2026-04-21 10:04:04.985167583 +0000 UTC m=+19.370006490" observedRunningTime="2026-04-21 10:04:11.458639404 +0000 UTC m=+25.843478331" watchObservedRunningTime="2026-04-21 10:04:11.459124557 +0000 UTC m=+25.843963483" Apr 21 10:04:12.172792 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.172521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4kjcv"] Apr 21 10:04:12.173006 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.172900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:12.173073 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:12.173003 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:12.173287 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.173260 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m5ktd"] Apr 21 10:04:12.173452 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.173379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:12.173512 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:12.173477 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:12.173783 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.173766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z7xfz"] Apr 21 10:04:12.173851 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.173839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:12.173953 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:12.173937 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:12.416408 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:12.416344 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 10:04:13.419487 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:13.419446 2575 generic.go:358] "Generic (PLEG): container finished" podID="d4d5b62e-5966-4d1f-b4ca-61a8acf47843" containerID="a2213c64aa1fba9856ff9d6397946579d9d5632ab3cebe4e5d018edf99dc5f30" exitCode=0 Apr 21 10:04:13.419967 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:13.419526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerDied","Data":"a2213c64aa1fba9856ff9d6397946579d9d5632ab3cebe4e5d018edf99dc5f30"} Apr 21 10:04:13.419967 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:13.419721 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 10:04:14.218508 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:14.218476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:14.218689 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:14.218484 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:14.218689 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:14.218626 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:14.218689 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:14.218642 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:14.218850 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:14.218771 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:14.218850 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:14.218803 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:15.425130 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:15.424942 2575 generic.go:358] "Generic (PLEG): container finished" podID="d4d5b62e-5966-4d1f-b4ca-61a8acf47843" containerID="45cb3ed7a0e47797c63a5d88fdcee038117a2e38df9e53b8a4038d6bf9d15c3d" exitCode=0 Apr 21 10:04:15.425130 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:15.425027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerDied","Data":"45cb3ed7a0e47797c63a5d88fdcee038117a2e38df9e53b8a4038d6bf9d15c3d"} Apr 21 10:04:15.662262 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:15.662001 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:04:15.662262 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:15.662249 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 10:04:15.672291 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:15.672270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w88kn" Apr 21 10:04:16.220321 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:16.220294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:16.220490 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:16.220414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:04:16.220490 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:16.220476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:16.220605 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:16.220578 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjcv" podUID="983ab4c3-e7a5-4914-b247-d139ed1699ad" Apr 21 10:04:16.220670 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:16.220623 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:16.220718 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:16.220695 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m5ktd" podUID="43b65a7c-65a4-4a35-8d96-82a1e1a9288d" Apr 21 10:04:17.455304 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.455273 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-205.ec2.internal" event="NodeReady" Apr 21 10:04:17.455716 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.455395 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 10:04:17.513843 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.513813 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pwrsm"] Apr 21 10:04:17.529637 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.529609 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-968lb"] Apr 21 10:04:17.529809 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.529785 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.534841 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.534700 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 10:04:17.535127 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.534708 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nqzpr\"" Apr 21 10:04:17.535127 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.535029 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 10:04:17.543512 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.543479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pwrsm"] Apr 21 10:04:17.543613 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.543569 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:17.547052 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.546969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 10:04:17.547052 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.546985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 10:04:17.547242 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.547225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 10:04:17.547312 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.547225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2fp54\"" Apr 21 10:04:17.553870 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.553849 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-968lb"] Apr 21 10:04:17.654729 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.654700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4004e34e-ae2c-4815-bea5-806b5e15036f-config-volume\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.654912 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.654737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qcst\" (UniqueName: \"kubernetes.io/projected/4004e34e-ae2c-4815-bea5-806b5e15036f-kube-api-access-7qcst\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.654912 
ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.654754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4004e34e-ae2c-4815-bea5-806b5e15036f-tmp-dir\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.654912 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.654840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.654912 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.654861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:17.655083 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.654947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2lk\" (UniqueName: \"kubernetes.io/projected/1532c4fc-8a96-448b-954a-32eca5cac710-kube-api-access-dd2lk\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:17.756159 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4004e34e-ae2c-4815-bea5-806b5e15036f-config-volume\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " 
pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.756159 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qcst\" (UniqueName: \"kubernetes.io/projected/4004e34e-ae2c-4815-bea5-806b5e15036f-kube-api-access-7qcst\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.756159 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4004e34e-ae2c-4815-bea5-806b5e15036f-tmp-dir\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.756396 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.756396 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:17.756513 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:17.756497 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:17.756549 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd2lk\" (UniqueName: 
\"kubernetes.io/projected/1532c4fc-8a96-448b-954a-32eca5cac710-kube-api-access-dd2lk\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:17.756586 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:17.756562 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.256543288 +0000 UTC m=+32.641382196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:17.756631 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:17.756589 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:17.756664 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:17.756639 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.256622785 +0000 UTC m=+32.641461697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:17.756664 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.756638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4004e34e-ae2c-4815-bea5-806b5e15036f-tmp-dir\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.758363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.758342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4004e34e-ae2c-4815-bea5-806b5e15036f-config-volume\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:17.776498 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.776473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2lk\" (UniqueName: \"kubernetes.io/projected/1532c4fc-8a96-448b-954a-32eca5cac710-kube-api-access-dd2lk\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:17.776870 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:17.776844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qcst\" (UniqueName: \"kubernetes.io/projected/4004e34e-ae2c-4815-bea5-806b5e15036f-kube-api-access-7qcst\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:18.218462 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.218426 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:18.218623 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.218426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:18.219168 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.218434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:18.222168 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.222146 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 10:04:18.222279 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.222191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:18.222279 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.222222 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w68bg\"" Apr 21 10:04:18.222279 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.222249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfhd5\"" Apr 21 10:04:18.222438 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.222314 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:18.222438 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.222402 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:18.259729 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.259710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:18.259841 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.259737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:18.259915 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:18.259848 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:18.259915 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:18.259908 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:19.259895413 +0000 UTC m=+33.644734317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:18.260032 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:18.259848 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:18.260032 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:18.259963 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:04:19.259950307 +0000 UTC m=+33.644789218 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:18.863697 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.863664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:18.864194 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:18.863783 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:04:18.864194 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:18.863853 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:50.863831689 +0000 UTC m=+65.248670593 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : secret "metrics-daemon-secret" not found Apr 21 10:04:18.964580 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.964542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:18.967783 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:18.967758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd98l\" (UniqueName: \"kubernetes.io/projected/983ab4c3-e7a5-4914-b247-d139ed1699ad-kube-api-access-pd98l\") pod \"network-check-target-4kjcv\" (UID: \"983ab4c3-e7a5-4914-b247-d139ed1699ad\") " pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:19.135792 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:19.135713 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:19.267168 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:19.267123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:19.267361 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:19.267178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:19.267361 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:19.267293 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:19.267361 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:19.267309 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:19.267537 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:19.267381 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:04:21.267359219 +0000 UTC m=+35.652198125 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:19.267537 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:19.267416 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:21.267395459 +0000 UTC m=+35.652234366 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:21.093067 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:21.092905 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4kjcv"] Apr 21 10:04:21.166671 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:04:21.166639 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod983ab4c3_e7a5_4914_b247_d139ed1699ad.slice/crio-eeaaf2d02083555985265e1f24ada9872cba52ba93f6f1d2b2bc767741be3a91 WatchSource:0}: Error finding container eeaaf2d02083555985265e1f24ada9872cba52ba93f6f1d2b2bc767741be3a91: Status 404 returned error can't find the container with id eeaaf2d02083555985265e1f24ada9872cba52ba93f6f1d2b2bc767741be3a91 Apr 21 10:04:21.283870 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:21.283834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " 
pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:21.284031 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:21.283899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:21.284101 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:21.284029 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:21.284101 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:21.284035 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:21.284101 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:21.284097 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.284077191 +0000 UTC m=+39.668916096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:21.284248 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:21.284117 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.284107446 +0000 UTC m=+39.668946357 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:21.438365 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:21.438331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4kjcv" event={"ID":"983ab4c3-e7a5-4914-b247-d139ed1699ad","Type":"ContainerStarted","Data":"eeaaf2d02083555985265e1f24ada9872cba52ba93f6f1d2b2bc767741be3a91"} Apr 21 10:04:21.440975 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:21.440937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerStarted","Data":"0320839233f8585e9c4cf7e722edf43abfff77c52d92f6a6ac4c58116a7dfcec"} Apr 21 10:04:22.445422 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:22.445384 2575 generic.go:358] "Generic (PLEG): container finished" podID="d4d5b62e-5966-4d1f-b4ca-61a8acf47843" containerID="0320839233f8585e9c4cf7e722edf43abfff77c52d92f6a6ac4c58116a7dfcec" exitCode=0 Apr 21 10:04:22.446093 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:22.445427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerDied","Data":"0320839233f8585e9c4cf7e722edf43abfff77c52d92f6a6ac4c58116a7dfcec"} Apr 21 10:04:23.450450 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:23.450419 2575 generic.go:358] "Generic (PLEG): container finished" podID="d4d5b62e-5966-4d1f-b4ca-61a8acf47843" containerID="99927f2d07cbe0dc346ae681ef26e57578d8ae9cd27e4e0da5fc40fae03d8d78" exitCode=0 Apr 21 10:04:23.451003 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:23.450467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerDied","Data":"99927f2d07cbe0dc346ae681ef26e57578d8ae9cd27e4e0da5fc40fae03d8d78"} Apr 21 10:04:24.407746 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.407721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:24.410562 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.410541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/43b65a7c-65a4-4a35-8d96-82a1e1a9288d-original-pull-secret\") pod \"global-pull-secret-syncer-m5ktd\" (UID: \"43b65a7c-65a4-4a35-8d96-82a1e1a9288d\") " pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:24.454824 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.454800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xmljw" event={"ID":"d4d5b62e-5966-4d1f-b4ca-61a8acf47843","Type":"ContainerStarted","Data":"9bb1e4c04d86ccef71484a25f4fa31523b10a62ea7d3ea84082a791a1477853c"} Apr 21 10:04:24.456004 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.455983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4kjcv" event={"ID":"983ab4c3-e7a5-4914-b247-d139ed1699ad","Type":"ContainerStarted","Data":"90c218f2bb80cd2326058edd3414587d3399f164050dd54cba78faa1560a53f9"} Apr 21 10:04:24.456116 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.456105 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:04:24.484765 
ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.484265 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xmljw" podStartSLOduration=4.698520802 podStartE2EDuration="38.48425001s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:03:47.418317594 +0000 UTC m=+1.803156498" lastFinishedPulling="2026-04-21 10:04:21.204046802 +0000 UTC m=+35.588885706" observedRunningTime="2026-04-21 10:04:24.482775409 +0000 UTC m=+38.867614334" watchObservedRunningTime="2026-04-21 10:04:24.48425001 +0000 UTC m=+38.869088935" Apr 21 10:04:24.504116 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.504043 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4kjcv" podStartSLOduration=35.431295192 podStartE2EDuration="38.504031525s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:04:21.18334405 +0000 UTC m=+35.568182954" lastFinishedPulling="2026-04-21 10:04:24.256080379 +0000 UTC m=+38.640919287" observedRunningTime="2026-04-21 10:04:24.503339131 +0000 UTC m=+38.888178057" watchObservedRunningTime="2026-04-21 10:04:24.504031525 +0000 UTC m=+38.888870450" Apr 21 10:04:24.529943 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.529920 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m5ktd" Apr 21 10:04:24.641792 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:24.641761 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m5ktd"] Apr 21 10:04:24.644324 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:04:24.644302 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b65a7c_65a4_4a35_8d96_82a1e1a9288d.slice/crio-3ac6e34414b2c587cda1aa98b1451168072adedacaddc63ec59c285dd207f272 WatchSource:0}: Error finding container 3ac6e34414b2c587cda1aa98b1451168072adedacaddc63ec59c285dd207f272: Status 404 returned error can't find the container with id 3ac6e34414b2c587cda1aa98b1451168072adedacaddc63ec59c285dd207f272 Apr 21 10:04:25.313752 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:25.313715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:25.313936 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:25.313760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:25.313936 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:25.313896 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:25.313936 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:25.313901 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found 
Apr 21 10:04:25.314129 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:25.313960 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:33.313941821 +0000 UTC m=+47.698780731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:25.314129 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:25.313979 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:04:33.313968581 +0000 UTC m=+47.698807492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:25.459396 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:25.459360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m5ktd" event={"ID":"43b65a7c-65a4-4a35-8d96-82a1e1a9288d","Type":"ContainerStarted","Data":"3ac6e34414b2c587cda1aa98b1451168072adedacaddc63ec59c285dd207f272"} Apr 21 10:04:28.466224 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:28.466182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m5ktd" event={"ID":"43b65a7c-65a4-4a35-8d96-82a1e1a9288d","Type":"ContainerStarted","Data":"3b969751b733f862dbbeed88c954ed0eb6aa071368e8417e4af0a8a2fca5b747"} Apr 21 10:04:28.482972 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:28.482918 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m5ktd" podStartSLOduration=32.853545568 podStartE2EDuration="36.482900461s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:04:24.645984519 +0000 UTC m=+39.030823422" lastFinishedPulling="2026-04-21 10:04:28.275339407 +0000 UTC m=+42.660178315" observedRunningTime="2026-04-21 10:04:28.482384983 +0000 UTC m=+42.867223908" watchObservedRunningTime="2026-04-21 10:04:28.482900461 +0000 UTC m=+42.867739384" Apr 21 10:04:33.363044 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:33.363002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:33.363044 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:04:33.363045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:33.363442 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:33.363140 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:33.363442 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:33.363141 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:33.363442 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:33.363190 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:49.363176537 +0000 UTC m=+63.748015442 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:33.363442 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:33.363204 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:04:49.363197614 +0000 UTC m=+63.748036517 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:49.368728 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:49.368680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:04:49.368728 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:49.368730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:04:49.369305 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:49.368848 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:49.369305 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:49.368895 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:49.369305 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:49.368942 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:05:21.368922037 +0000 UTC m=+95.753760942 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:04:49.369305 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:49.368958 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:21.36895177 +0000 UTC m=+95.753790674 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:04:50.876559 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:50.876515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:04:50.876983 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:50.876658 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:04:50.876983 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:04:50.876720 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. No retries permitted until 2026-04-21 10:05:54.876704685 +0000 UTC m=+129.261543589 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : secret "metrics-daemon-secret" not found Apr 21 10:04:55.462697 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:04:55.462560 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4kjcv" Apr 21 10:05:21.371201 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:05:21.371157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:05:21.371201 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:05:21.371199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:05:21.371767 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:05:21.371301 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:05:21.371767 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:05:21.371326 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:21.371767 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:05:21.371371 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:25.371353018 +0000 UTC m=+159.756191923 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:05:21.371767 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:05:21.371385 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:25.371379017 +0000 UTC m=+159.756217921 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:05:54.896250 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:05:54.896199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:05:54.896768 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:05:54.896338 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:05:54.896768 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:05:54.896417 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs podName:1100ece6-afda-453b-8595-490c94dbb90d nodeName:}" failed. 
No retries permitted until 2026-04-21 10:07:56.896399989 +0000 UTC m=+251.281238892 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs") pod "network-metrics-daemon-z7xfz" (UID: "1100ece6-afda-453b-8595-490c94dbb90d") : secret "metrics-daemon-secret" not found Apr 21 10:06:15.417977 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.417942 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd"] Apr 21 10:06:15.420842 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.420823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.426316 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.426289 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gzcbd"] Apr 21 10:06:15.427082 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.427031 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cwt7g\"" Apr 21 10:06:15.427247 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.427187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 10:06:15.427247 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.427223 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 10:06:15.427360 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.427229 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 10:06:15.427360 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.427232 2575 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 10:06:15.429369 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.429348 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5578cc5b5b-gktb6"] Apr 21 10:06:15.429535 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.429507 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.431969 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.431948 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9blcx\"" Apr 21 10:06:15.432295 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.432277 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 10:06:15.432506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.432488 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 10:06:15.432596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.432488 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 10:06:15.432736 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.432713 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd"] Apr 21 10:06:15.432852 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.432814 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.433300 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.433283 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 10:06:15.439664 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.439643 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 10:06:15.440662 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.440601 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-s6fzq\"" Apr 21 10:06:15.440732 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.440670 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 10:06:15.441101 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.440905 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 10:06:15.441270 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.441232 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 10:06:15.441395 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.441366 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 10:06:15.441465 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.441429 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 10:06:15.441529 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.441504 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gzcbd"] Apr 21 10:06:15.442084 
ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.442061 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5578cc5b5b-gktb6"] Apr 21 10:06:15.444628 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.444610 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 10:06:15.525865 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.525844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggtbw\" (UniqueName: \"kubernetes.io/projected/95ed4025-e077-4170-b387-681d37c22925-kube-api-access-ggtbw\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.525971 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.525870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.525971 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.525905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r85h\" (UniqueName: \"kubernetes.io/projected/dc8f7796-af2d-4cc6-8807-bb8dcf779711-kube-api-access-5r85h\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.525971 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.525927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.526074 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.525982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ed4025-e077-4170-b387-681d37c22925-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.526074 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95ed4025-e077-4170-b387-681d37c22925-tmp\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.526074 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-default-certificate\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.526074 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/95ed4025-e077-4170-b387-681d37c22925-snapshots\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " 
pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.526074 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ed4025-e077-4170-b387-681d37c22925-serving-cert\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.526278 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ed4025-e077-4170-b387-681d37c22925-service-ca-bundle\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.526278 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-stats-auth\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.526278 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.526278 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc8f7796-af2d-4cc6-8807-bb8dcf779711-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.526278 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.526221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2n7j\" (UniqueName: \"kubernetes.io/projected/d9d0fea7-43c5-4e4f-a753-45edf112980c-kube-api-access-f2n7j\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.627421 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ed4025-e077-4170-b387-681d37c22925-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627524 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95ed4025-e077-4170-b387-681d37c22925-tmp\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627586 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-default-certificate\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " 
pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.627586 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/95ed4025-e077-4170-b387-681d37c22925-snapshots\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627691 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ed4025-e077-4170-b387-681d37c22925-serving-cert\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627748 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ed4025-e077-4170-b387-681d37c22925-service-ca-bundle\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627748 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-stats-auth\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.627748 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.627916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc8f7796-af2d-4cc6-8807-bb8dcf779711-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.627916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2n7j\" (UniqueName: \"kubernetes.io/projected/d9d0fea7-43c5-4e4f-a753-45edf112980c-kube-api-access-f2n7j\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.627916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95ed4025-e077-4170-b387-681d37c22925-tmp\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggtbw\" (UniqueName: \"kubernetes.io/projected/95ed4025-e077-4170-b387-681d37c22925-kube-api-access-ggtbw\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.627916 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:06:15.627853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.627916 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:15.627859 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:15.627916 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r85h\" (UniqueName: \"kubernetes.io/projected/dc8f7796-af2d-4cc6-8807-bb8dcf779711-kube-api-access-5r85h\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.628266 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:15.627947 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.127929841 +0000 UTC m=+150.512768748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : secret "router-metrics-certs-default" not found Apr 21 10:06:15.628266 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.627968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.628266 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:15.628046 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:15.628266 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:15.628082 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls podName:dc8f7796-af2d-4cc6-8807-bb8dcf779711 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.128069653 +0000 UTC m=+150.512908556 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5d7nd" (UID: "dc8f7796-af2d-4cc6-8807-bb8dcf779711") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:15.628266 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.628149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/95ed4025-e077-4170-b387-681d37c22925-snapshots\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.628532 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.628372 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ed4025-e077-4170-b387-681d37c22925-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.628532 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:15.628392 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.128377057 +0000 UTC m=+150.513215963 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:15.628532 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.628507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ed4025-e077-4170-b387-681d37c22925-service-ca-bundle\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.628656 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.628537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc8f7796-af2d-4cc6-8807-bb8dcf779711-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.630181 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.630157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ed4025-e077-4170-b387-681d37c22925-serving-cert\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.630456 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.630438 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-stats-auth\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 
21 10:06:15.630519 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.630441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-default-certificate\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.636974 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.636952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2n7j\" (UniqueName: \"kubernetes.io/projected/d9d0fea7-43c5-4e4f-a753-45edf112980c-kube-api-access-f2n7j\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:15.637471 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.637454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggtbw\" (UniqueName: \"kubernetes.io/projected/95ed4025-e077-4170-b387-681d37c22925-kube-api-access-ggtbw\") pod \"insights-operator-585dfdc468-gzcbd\" (UID: \"95ed4025-e077-4170-b387-681d37c22925\") " pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.637539 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.637491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r85h\" (UniqueName: \"kubernetes.io/projected/dc8f7796-af2d-4cc6-8807-bb8dcf779711-kube-api-access-5r85h\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:15.745792 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.745745 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" Apr 21 10:06:15.858650 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:15.858623 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gzcbd"] Apr 21 10:06:15.861894 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:15.861853 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95ed4025_e077_4170_b387_681d37c22925.slice/crio-782b072cc5646009a13e0e03ffbbc07e9367dc0b8b380dbe3404ab1a6262da3f WatchSource:0}: Error finding container 782b072cc5646009a13e0e03ffbbc07e9367dc0b8b380dbe3404ab1a6262da3f: Status 404 returned error can't find the container with id 782b072cc5646009a13e0e03ffbbc07e9367dc0b8b380dbe3404ab1a6262da3f Apr 21 10:06:16.131318 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:16.131218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:16.131318 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:16.131276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:16.131533 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:16.131361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:16.131533 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:16.131390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:17.131363032 +0000 UTC m=+151.516201960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:16.131533 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:16.131443 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:16.131533 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:16.131445 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:16.131533 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:16.131494 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:17.131478155 +0000 UTC m=+151.516317061 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : secret "router-metrics-certs-default" not found Apr 21 10:06:16.131792 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:16.131534 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls podName:dc8f7796-af2d-4cc6-8807-bb8dcf779711 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:17.131520484 +0000 UTC m=+151.516359403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5d7nd" (UID: "dc8f7796-af2d-4cc6-8807-bb8dcf779711") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:16.671066 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:16.670993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" event={"ID":"95ed4025-e077-4170-b387-681d37c22925","Type":"ContainerStarted","Data":"782b072cc5646009a13e0e03ffbbc07e9367dc0b8b380dbe3404ab1a6262da3f"} Apr 21 10:06:17.138208 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:17.138116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:17.138208 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:17.138182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:17.138445 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:17.138212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:17.138445 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:17.138278 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:17.138445 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:17.138318 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:17.138445 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:17.138332 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:19.13831092 +0000 UTC m=+153.523149836 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:17.138445 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:17.138355 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:19.138344365 +0000 UTC m=+153.523183269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : secret "router-metrics-certs-default" not found Apr 21 10:06:17.138445 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:17.138369 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls podName:dc8f7796-af2d-4cc6-8807-bb8dcf779711 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:19.138361323 +0000 UTC m=+153.523200228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5d7nd" (UID: "dc8f7796-af2d-4cc6-8807-bb8dcf779711") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:17.677888 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:17.677852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" event={"ID":"95ed4025-e077-4170-b387-681d37c22925","Type":"ContainerStarted","Data":"4dca6e9d42ae6544ca0a8c2777b11fa3a4b5fa0505b2beb5a4d84cca38c85fb9"} Apr 21 10:06:17.696764 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:17.696721 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" podStartSLOduration=1.180745061 podStartE2EDuration="2.696709525s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="2026-04-21 10:06:15.863588868 +0000 UTC m=+150.248427772" lastFinishedPulling="2026-04-21 10:06:17.379553327 +0000 UTC m=+151.764392236" observedRunningTime="2026-04-21 10:06:17.695186984 +0000 UTC m=+152.080025923" watchObservedRunningTime="2026-04-21 10:06:17.696709525 +0000 UTC m=+152.081548452" Apr 21 10:06:19.149827 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:19.149789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:19.149866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:19.149927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:19.149976 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:19.150033 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:19.150055 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.150032056 +0000 UTC m=+157.534870975 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:19.150078 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.150065518 +0000 UTC m=+157.534904422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : secret "router-metrics-certs-default" not found Apr 21 10:06:19.150294 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:19.150093 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls podName:dc8f7796-af2d-4cc6-8807-bb8dcf779711 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.150084699 +0000 UTC m=+157.534923609 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5d7nd" (UID: "dc8f7796-af2d-4cc6-8807-bb8dcf779711") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:20.541484 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:20.541444 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pwrsm" podUID="4004e34e-ae2c-4815-bea5-806b5e15036f" Apr 21 10:06:20.553769 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:20.553735 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-968lb" podUID="1532c4fc-8a96-448b-954a-32eca5cac710" Apr 21 10:06:20.683501 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:20.683471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:06:20.683630 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:20.683479 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pwrsm" Apr 21 10:06:20.701641 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:20.701620 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9nzkd_15e1aa47-50b2-4765-a870-9c646ae4fb01/dns-node-resolver/0.log" Apr 21 10:06:21.242563 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:21.242529 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-z7xfz" podUID="1100ece6-afda-453b-8595-490c94dbb90d" Apr 21 10:06:21.705224 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:21.705195 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7p7gp_bf91954e-2bd6-4597-b660-140925e88c87/node-ca/0.log" Apr 21 10:06:21.845804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:21.845770 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b"] Apr 21 10:06:21.848778 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:21.848760 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" Apr 21 10:06:21.854253 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:21.854230 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mdckq\"" Apr 21 10:06:21.863675 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:21.863647 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b"] Apr 21 10:06:21.972361 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:21.972270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbb6m\" (UniqueName: \"kubernetes.io/projected/08e569b3-eeff-4b72-b452-72b801cbdb72-kube-api-access-sbb6m\") pod \"network-check-source-8894fc9bd-gpv7b\" (UID: \"08e569b3-eeff-4b72-b452-72b801cbdb72\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" Apr 21 10:06:22.073659 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:22.073610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbb6m\" (UniqueName: \"kubernetes.io/projected/08e569b3-eeff-4b72-b452-72b801cbdb72-kube-api-access-sbb6m\") pod \"network-check-source-8894fc9bd-gpv7b\" (UID: \"08e569b3-eeff-4b72-b452-72b801cbdb72\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" Apr 21 10:06:22.083353 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:22.083327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbb6m\" (UniqueName: \"kubernetes.io/projected/08e569b3-eeff-4b72-b452-72b801cbdb72-kube-api-access-sbb6m\") pod \"network-check-source-8894fc9bd-gpv7b\" (UID: \"08e569b3-eeff-4b72-b452-72b801cbdb72\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" Apr 21 10:06:22.157159 ip-10-0-137-205 kubenswrapper[2575]: I0421 
10:06:22.157118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" Apr 21 10:06:22.274158 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:22.274131 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b"] Apr 21 10:06:22.275297 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:22.275272 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e569b3_eeff_4b72_b452_72b801cbdb72.slice/crio-df8f76044dec0e529fca4a70e94acdc81bf11ec498f46e249b06b28f66e1ada7 WatchSource:0}: Error finding container df8f76044dec0e529fca4a70e94acdc81bf11ec498f46e249b06b28f66e1ada7: Status 404 returned error can't find the container with id df8f76044dec0e529fca4a70e94acdc81bf11ec498f46e249b06b28f66e1ada7 Apr 21 10:06:22.688333 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:22.688303 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" event={"ID":"08e569b3-eeff-4b72-b452-72b801cbdb72","Type":"ContainerStarted","Data":"bc42c451d03859756003256eb0d78b9fb670d06840ef0133afc52b3e5eb5b13e"} Apr 21 10:06:22.688481 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:22.688336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" event={"ID":"08e569b3-eeff-4b72-b452-72b801cbdb72","Type":"ContainerStarted","Data":"df8f76044dec0e529fca4a70e94acdc81bf11ec498f46e249b06b28f66e1ada7"} Apr 21 10:06:22.705233 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:22.705191 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gpv7b" podStartSLOduration=1.705172402 podStartE2EDuration="1.705172402s" podCreationTimestamp="2026-04-21 10:06:21 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:22.704890416 +0000 UTC m=+157.089729334" watchObservedRunningTime="2026-04-21 10:06:22.705172402 +0000 UTC m=+157.090011327" Apr 21 10:06:23.183087 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:23.183053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:23.183210 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:23.183120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:23.183210 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:23.183161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:23.183306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:23.183216 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:23.183306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:23.183277 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:31.183256518 +0000 UTC m=+165.568095432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : secret "router-metrics-certs-default" not found Apr 21 10:06:23.183306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:23.183280 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:23.183306 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:23.183299 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle podName:d9d0fea7-43c5-4e4f-a753-45edf112980c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:31.183286745 +0000 UTC m=+165.568125660 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle") pod "router-default-5578cc5b5b-gktb6" (UID: "d9d0fea7-43c5-4e4f-a753-45edf112980c") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:23.183431 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:23.183341 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls podName:dc8f7796-af2d-4cc6-8807-bb8dcf779711 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:31.183325607 +0000 UTC m=+165.568164513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5d7nd" (UID: "dc8f7796-af2d-4cc6-8807-bb8dcf779711") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:25.400669 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:25.400568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm" Apr 21 10:06:25.400669 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:25.400607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb" Apr 21 10:06:25.401189 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:25.400724 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:06:25.401189 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:25.400796 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls podName:4004e34e-ae2c-4815-bea5-806b5e15036f nodeName:}" failed. No retries permitted until 2026-04-21 10:08:27.40077916 +0000 UTC m=+281.785618064 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls") pod "dns-default-pwrsm" (UID: "4004e34e-ae2c-4815-bea5-806b5e15036f") : secret "dns-default-metrics-tls" not found Apr 21 10:06:25.401189 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:25.400725 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:06:25.401189 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:25.400897 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert podName:1532c4fc-8a96-448b-954a-32eca5cac710 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:27.400862055 +0000 UTC m=+281.785700960 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert") pod "ingress-canary-968lb" (UID: "1532c4fc-8a96-448b-954a-32eca5cac710") : secret "canary-serving-cert" not found Apr 21 10:06:31.242421 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.242383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:31.242802 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.242430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:31.242802 ip-10-0-137-205 kubenswrapper[2575]: 
I0421 10:06:31.242455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:31.242802 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:31.242562 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:31.242802 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:06:31.242615 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls podName:dc8f7796-af2d-4cc6-8807-bb8dcf779711 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:47.242600939 +0000 UTC m=+181.627439844 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5d7nd" (UID: "dc8f7796-af2d-4cc6-8807-bb8dcf779711") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:31.243080 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.243059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9d0fea7-43c5-4e4f-a753-45edf112980c-service-ca-bundle\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:31.244652 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.244626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9d0fea7-43c5-4e4f-a753-45edf112980c-metrics-certs\") pod \"router-default-5578cc5b5b-gktb6\" (UID: \"d9d0fea7-43c5-4e4f-a753-45edf112980c\") " pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:31.350725 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.350696 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:31.466996 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.466969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5578cc5b5b-gktb6"] Apr 21 10:06:31.469501 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:31.469475 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d0fea7_43c5_4e4f_a753_45edf112980c.slice/crio-040685ce962c26073e368312d4b9075b400f2de31303a210f26127316046e4a0 WatchSource:0}: Error finding container 040685ce962c26073e368312d4b9075b400f2de31303a210f26127316046e4a0: Status 404 returned error can't find the container with id 040685ce962c26073e368312d4b9075b400f2de31303a210f26127316046e4a0 Apr 21 10:06:31.709200 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.709167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" event={"ID":"d9d0fea7-43c5-4e4f-a753-45edf112980c","Type":"ContainerStarted","Data":"07b9717505841a03aa7d05c0aae6e1a9880f7eb1dc66070f939d1e9523d5627d"} Apr 21 10:06:31.709339 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.709205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" event={"ID":"d9d0fea7-43c5-4e4f-a753-45edf112980c","Type":"ContainerStarted","Data":"040685ce962c26073e368312d4b9075b400f2de31303a210f26127316046e4a0"} Apr 21 10:06:31.727002 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:31.726965 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" podStartSLOduration=16.726950755 podStartE2EDuration="16.726950755s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
10:06:31.726486158 +0000 UTC m=+166.111325084" watchObservedRunningTime="2026-04-21 10:06:31.726950755 +0000 UTC m=+166.111789684" Apr 21 10:06:32.351302 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:32.351267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:32.353982 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:32.353959 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:32.713689 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:32.713659 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:32.715001 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:32.714978 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5578cc5b5b-gktb6" Apr 21 10:06:33.218752 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:33.218721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz" Apr 21 10:06:44.415508 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.415465 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6pxnr"] Apr 21 10:06:44.418668 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.418635 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.421580 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.421557 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 10:06:44.421580 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.421568 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 10:06:44.422725 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.422703 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fdjpw\"" Apr 21 10:06:44.429439 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.429414 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6pxnr"] Apr 21 10:06:44.512307 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.512279 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-779dbdc744-bx9cs"] Apr 21 10:06:44.514543 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.514523 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.517207 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.517186 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7p24h\"" Apr 21 10:06:44.517315 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.517186 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 10:06:44.517315 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.517238 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 10:06:44.517315 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.517249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 10:06:44.523194 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.523174 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 10:06:44.527631 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.527601 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779dbdc744-bx9cs"] Apr 21 10:06:44.528740 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.528719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2fb17919-bd4a-48df-a399-d9c623b57ea9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.528830 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.528785 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2fb17919-bd4a-48df-a399-d9c623b57ea9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.528909 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.528826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2fb17919-bd4a-48df-a399-d9c623b57ea9-crio-socket\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.528909 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.528854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fb17919-bd4a-48df-a399-d9c623b57ea9-data-volume\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.528909 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.528872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdl4\" (UniqueName: \"kubernetes.io/projected/2fb17919-bd4a-48df-a399-d9c623b57ea9-kube-api-access-jsdl4\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629210 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c491f05b-ff00-4798-a36d-4e55bd12f403-trusted-ca\") pod \"image-registry-779dbdc744-bx9cs\" 
(UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629310 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629220 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c491f05b-ff00-4798-a36d-4e55bd12f403-registry-certificates\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629310 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-registry-tls\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629310 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fb17919-bd4a-48df-a399-d9c623b57ea9-data-volume\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629413 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdl4\" (UniqueName: \"kubernetes.io/projected/2fb17919-bd4a-48df-a399-d9c623b57ea9-kube-api-access-jsdl4\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629413 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629402 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fxs\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-kube-api-access-h2fxs\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629478 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2fb17919-bd4a-48df-a399-d9c623b57ea9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629519 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c491f05b-ff00-4798-a36d-4e55bd12f403-installation-pull-secrets\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629519 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-bound-sa-token\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629588 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2fb17919-bd4a-48df-a399-d9c623b57ea9-crio-socket\") pod 
\"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629588 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fb17919-bd4a-48df-a399-d9c623b57ea9-data-volume\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629588 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c491f05b-ff00-4798-a36d-4e55bd12f403-ca-trust-extracted\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629689 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2fb17919-bd4a-48df-a399-d9c623b57ea9-crio-socket\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629689 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2fb17919-bd4a-48df-a399-d9c623b57ea9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.629689 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629642 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c491f05b-ff00-4798-a36d-4e55bd12f403-image-registry-private-configuration\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.629954 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.629933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2fb17919-bd4a-48df-a399-d9c623b57ea9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.631763 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.631741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2fb17919-bd4a-48df-a399-d9c623b57ea9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.640750 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.640731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdl4\" (UniqueName: \"kubernetes.io/projected/2fb17919-bd4a-48df-a399-d9c623b57ea9-kube-api-access-jsdl4\") pod \"insights-runtime-extractor-6pxnr\" (UID: \"2fb17919-bd4a-48df-a399-d9c623b57ea9\") " pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.727775 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.727754 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6pxnr" Apr 21 10:06:44.730438 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c491f05b-ff00-4798-a36d-4e55bd12f403-trusted-ca\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c491f05b-ff00-4798-a36d-4e55bd12f403-registry-certificates\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-registry-tls\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fxs\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-kube-api-access-h2fxs\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730673 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c491f05b-ff00-4798-a36d-4e55bd12f403-installation-pull-secrets\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730744 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-bound-sa-token\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730744 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c491f05b-ff00-4798-a36d-4e55bd12f403-ca-trust-extracted\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.730845 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.730765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c491f05b-ff00-4798-a36d-4e55bd12f403-image-registry-private-configuration\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.731225 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.731205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c491f05b-ff00-4798-a36d-4e55bd12f403-ca-trust-extracted\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " 
pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.731398 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.731373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c491f05b-ff00-4798-a36d-4e55bd12f403-registry-certificates\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.731605 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.731496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c491f05b-ff00-4798-a36d-4e55bd12f403-trusted-ca\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.733336 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.733317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c491f05b-ff00-4798-a36d-4e55bd12f403-image-registry-private-configuration\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.733493 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.733432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-registry-tls\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.733493 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.733462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/c491f05b-ff00-4798-a36d-4e55bd12f403-installation-pull-secrets\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.742415 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.742394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-bound-sa-token\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.742918 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.742895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fxs\" (UniqueName: \"kubernetes.io/projected/c491f05b-ff00-4798-a36d-4e55bd12f403-kube-api-access-h2fxs\") pod \"image-registry-779dbdc744-bx9cs\" (UID: \"c491f05b-ff00-4798-a36d-4e55bd12f403\") " pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.824145 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.824121 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:44.845199 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.845165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6pxnr"] Apr 21 10:06:44.847997 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:44.847968 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb17919_bd4a_48df_a399_d9c623b57ea9.slice/crio-0af1086fd14595a9eb7cfd94da1a6f615a2c8d482b42e498d2f10c5042dd07e6 WatchSource:0}: Error finding container 0af1086fd14595a9eb7cfd94da1a6f615a2c8d482b42e498d2f10c5042dd07e6: Status 404 returned error can't find the container with id 0af1086fd14595a9eb7cfd94da1a6f615a2c8d482b42e498d2f10c5042dd07e6 Apr 21 10:06:44.941319 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:44.941291 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779dbdc744-bx9cs"] Apr 21 10:06:44.944240 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:44.944210 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc491f05b_ff00_4798_a36d_4e55bd12f403.slice/crio-df3c3c69b905fac292421b3673ec412c8342d58026016326a50194b9cc2f1d8c WatchSource:0}: Error finding container df3c3c69b905fac292421b3673ec412c8342d58026016326a50194b9cc2f1d8c: Status 404 returned error can't find the container with id df3c3c69b905fac292421b3673ec412c8342d58026016326a50194b9cc2f1d8c Apr 21 10:06:45.745369 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:45.745288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6pxnr" event={"ID":"2fb17919-bd4a-48df-a399-d9c623b57ea9","Type":"ContainerStarted","Data":"4fc50a85d6af8c2b1c795723f798f0912a7925e2eab3c95c214f06919c3b53a5"} Apr 21 10:06:45.745369 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:06:45.745332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6pxnr" event={"ID":"2fb17919-bd4a-48df-a399-d9c623b57ea9","Type":"ContainerStarted","Data":"f89c854adc3ce8ea55739002c91edf6f16ea6abdee8b0cb20b9ec52a07be8f13"} Apr 21 10:06:45.745369 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:45.745348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6pxnr" event={"ID":"2fb17919-bd4a-48df-a399-d9c623b57ea9","Type":"ContainerStarted","Data":"0af1086fd14595a9eb7cfd94da1a6f615a2c8d482b42e498d2f10c5042dd07e6"} Apr 21 10:06:45.746708 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:45.746683 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" event={"ID":"c491f05b-ff00-4798-a36d-4e55bd12f403","Type":"ContainerStarted","Data":"ace564805ca4f7012ae35444860dc79d9ce8ff262daec02a08eb3278058ba918"} Apr 21 10:06:45.746708 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:45.746709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" event={"ID":"c491f05b-ff00-4798-a36d-4e55bd12f403","Type":"ContainerStarted","Data":"df3c3c69b905fac292421b3673ec412c8342d58026016326a50194b9cc2f1d8c"} Apr 21 10:06:45.746867 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:45.746854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" Apr 21 10:06:45.766941 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:45.766902 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs" podStartSLOduration=1.7668714140000001 podStartE2EDuration="1.766871414s" podCreationTimestamp="2026-04-21 10:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:45.765327485 +0000 UTC m=+180.150166412" watchObservedRunningTime="2026-04-21 10:06:45.766871414 +0000 UTC m=+180.151710340" Apr 21 10:06:47.251599 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.251563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:47.253931 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.253902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc8f7796-af2d-4cc6-8807-bb8dcf779711-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5d7nd\" (UID: \"dc8f7796-af2d-4cc6-8807-bb8dcf779711\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:47.533109 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.533049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cwt7g\"" Apr 21 10:06:47.541404 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.541386 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" Apr 21 10:06:47.657667 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.657639 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd"] Apr 21 10:06:47.660387 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:47.660361 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8f7796_af2d_4cc6_8807_bb8dcf779711.slice/crio-8cd56ab6bed22cf9b0d80b6e2b58c1cd2a2ff58be32468cd0ca10148ec4a4b80 WatchSource:0}: Error finding container 8cd56ab6bed22cf9b0d80b6e2b58c1cd2a2ff58be32468cd0ca10148ec4a4b80: Status 404 returned error can't find the container with id 8cd56ab6bed22cf9b0d80b6e2b58c1cd2a2ff58be32468cd0ca10148ec4a4b80 Apr 21 10:06:47.755349 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.755314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" event={"ID":"dc8f7796-af2d-4cc6-8807-bb8dcf779711","Type":"ContainerStarted","Data":"8cd56ab6bed22cf9b0d80b6e2b58c1cd2a2ff58be32468cd0ca10148ec4a4b80"} Apr 21 10:06:47.757027 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.757004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6pxnr" event={"ID":"2fb17919-bd4a-48df-a399-d9c623b57ea9","Type":"ContainerStarted","Data":"3ab2c201d6bab49bffdd4e9a8b07c33929b27144a4d9237f01e0db1d2542d9bd"} Apr 21 10:06:47.790826 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:47.790757 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6pxnr" podStartSLOduration=2.000647101 podStartE2EDuration="3.790746541s" podCreationTimestamp="2026-04-21 10:06:44 +0000 UTC" firstStartedPulling="2026-04-21 10:06:44.922544512 +0000 UTC m=+179.307383415" 
lastFinishedPulling="2026-04-21 10:06:46.712643951 +0000 UTC m=+181.097482855" observedRunningTime="2026-04-21 10:06:47.789506486 +0000 UTC m=+182.174345404" watchObservedRunningTime="2026-04-21 10:06:47.790746541 +0000 UTC m=+182.175585466" Apr 21 10:06:49.764456 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:49.764420 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" event={"ID":"dc8f7796-af2d-4cc6-8807-bb8dcf779711","Type":"ContainerStarted","Data":"3394a064bc7a5b46c88a19c6f7adc009380e8d297a6f0ea2ee95eb56e8291019"} Apr 21 10:06:49.780440 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:49.780399 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5d7nd" podStartSLOduration=33.379423463 podStartE2EDuration="34.780385008s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="2026-04-21 10:06:47.66226461 +0000 UTC m=+182.047103514" lastFinishedPulling="2026-04-21 10:06:49.063226155 +0000 UTC m=+183.448065059" observedRunningTime="2026-04-21 10:06:49.779710098 +0000 UTC m=+184.164549036" watchObservedRunningTime="2026-04-21 10:06:49.780385008 +0000 UTC m=+184.165223933" Apr 21 10:06:52.599732 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.599699 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bc76w"] Apr 21 10:06:52.602621 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.602600 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.605540 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.605515 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 10:06:52.605636 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.605565 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 10:06:52.605705 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.605672 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 10:06:52.606957 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.606939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-chs74\"" Apr 21 10:06:52.612271 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.612250 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bc76w"] Apr 21 10:06:52.691061 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.691036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54xh\" (UniqueName: \"kubernetes.io/projected/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-kube-api-access-t54xh\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.691166 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.691086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.691166 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.691151 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.691285 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.691232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.792051 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.792029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.792125 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.792068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.792171 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.792130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.792171 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.792160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t54xh\" (UniqueName: \"kubernetes.io/projected/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-kube-api-access-t54xh\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.792612 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.792591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.794455 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.794436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.794564 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.794545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.800894 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.800851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54xh\" (UniqueName: \"kubernetes.io/projected/b5ad8623-c5f0-42d4-bcc4-13ca87efed63-kube-api-access-t54xh\") pod \"prometheus-operator-5676c8c784-bc76w\" (UID: \"b5ad8623-c5f0-42d4-bcc4-13ca87efed63\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:52.910911 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:52.910863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" Apr 21 10:06:53.025317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:53.023408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bc76w"] Apr 21 10:06:53.027960 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:53.027935 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5ad8623_c5f0_42d4_bcc4_13ca87efed63.slice/crio-6c65d501eb85d10624df3555b40fa904d16244e218895ba638c7c3072b70a602 WatchSource:0}: Error finding container 6c65d501eb85d10624df3555b40fa904d16244e218895ba638c7c3072b70a602: Status 404 returned error can't find the container with id 6c65d501eb85d10624df3555b40fa904d16244e218895ba638c7c3072b70a602 Apr 21 10:06:53.775524 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:53.775481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" 
event={"ID":"b5ad8623-c5f0-42d4-bcc4-13ca87efed63","Type":"ContainerStarted","Data":"6c65d501eb85d10624df3555b40fa904d16244e218895ba638c7c3072b70a602"} Apr 21 10:06:54.779556 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:54.779517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" event={"ID":"b5ad8623-c5f0-42d4-bcc4-13ca87efed63","Type":"ContainerStarted","Data":"8f283e0fe5d6f3a05aabc1f8090f291e0797cea8f98de1b185c4814b1cbf5767"} Apr 21 10:06:54.779556 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:54.779557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" event={"ID":"b5ad8623-c5f0-42d4-bcc4-13ca87efed63","Type":"ContainerStarted","Data":"4d471049d7a8148914de7c8266fcae4bbbf9c13246c5a32c89ec245691673b52"} Apr 21 10:06:54.796445 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:54.796396 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-bc76w" podStartSLOduration=1.764790132 podStartE2EDuration="2.796382633s" podCreationTimestamp="2026-04-21 10:06:52 +0000 UTC" firstStartedPulling="2026-04-21 10:06:53.02993338 +0000 UTC m=+187.414772288" lastFinishedPulling="2026-04-21 10:06:54.061525873 +0000 UTC m=+188.446364789" observedRunningTime="2026-04-21 10:06:54.794454866 +0000 UTC m=+189.179293794" watchObservedRunningTime="2026-04-21 10:06:54.796382633 +0000 UTC m=+189.181221559" Apr 21 10:06:56.993980 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:56.993953 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr"] Apr 21 10:06:56.996208 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:56.996191 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:56.998931 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:56.998911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 10:06:56.999018 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:56.998951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8bwdk\"" Apr 21 10:06:56.999087 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:56.998948 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 10:06:57.012651 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.012631 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr"] Apr 21 10:06:57.052677 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.052655 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qg446"] Apr 21 10:06:57.054631 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.054616 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.057506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.057487 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 10:06:57.057640 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.057520 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5rh4g\"" Apr 21 10:06:57.057749 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.057731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 10:06:57.057826 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.057762 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 10:06:57.123506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.123481 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlssg\" (UniqueName: \"kubernetes.io/projected/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-kube-api-access-jlssg\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.123589 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.123515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.123589 ip-10-0-137-205 
kubenswrapper[2575]: I0421 10:06:57.123543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.123659 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.123591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.224208 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-accelerators-collector-config\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-textfile\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-root\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224374 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpqkn\" (UniqueName: \"kubernetes.io/projected/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-kube-api-access-jpqkn\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224374 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-sys\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224374 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlssg\" (UniqueName: \"kubernetes.io/projected/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-kube-api-access-jlssg\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.224374 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-wtmp\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224512 ip-10-0-137-205 kubenswrapper[2575]: 
I0421 10:06:57.224396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.224512 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224420 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.224512 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-tls\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224512 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224512 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-metrics-client-ca\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.224693 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.224513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.225087 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.225064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.226580 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.226558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.226662 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.226565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.232615 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.232596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlssg\" (UniqueName: \"kubernetes.io/projected/2fad54c4-d0f0-4d68-8efb-bf41e5c630ec-kube-api-access-jlssg\") pod \"openshift-state-metrics-9d44df66c-zljwr\" (UID: \"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.304247 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.304195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" Apr 21 10:06:57.324983 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.324957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-sys\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.325084 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-sys\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.325136 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-wtmp\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446" Apr 21 10:06:57.325136 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325127 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-tls\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325224 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325224 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-metrics-client-ca\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325224 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-accelerators-collector-config\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-textfile\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-wtmp\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-root\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpqkn\" (UniqueName: \"kubernetes.io/projected/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-kube-api-access-jpqkn\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-root\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325642 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325618 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-textfile\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.325955 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-accelerators-collector-config\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.326033 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.325972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-metrics-client-ca\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.327734 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.327716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-tls\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.327824 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.327802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.333270 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.333249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpqkn\" (UniqueName: \"kubernetes.io/projected/a12c9b59-e4c0-4ecf-b7ef-27e880596b38-kube-api-access-jpqkn\") pod \"node-exporter-qg446\" (UID: \"a12c9b59-e4c0-4ecf-b7ef-27e880596b38\") " pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.362946 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.362516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qg446"
Apr 21 10:06:57.370324 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:57.370297 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda12c9b59_e4c0_4ecf_b7ef_27e880596b38.slice/crio-be794095781f2cb0445ebe6848d898ca465a46ec23c84aa39ea7310565602cdf WatchSource:0}: Error finding container be794095781f2cb0445ebe6848d898ca465a46ec23c84aa39ea7310565602cdf: Status 404 returned error can't find the container with id be794095781f2cb0445ebe6848d898ca465a46ec23c84aa39ea7310565602cdf
Apr 21 10:06:57.420733 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.419173 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr"]
Apr 21 10:06:57.423062 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:06:57.423030 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fad54c4_d0f0_4d68_8efb_bf41e5c630ec.slice/crio-0ca32c8298586eec1eadfa8e443dc935f6b3f690c449a787cf06a3161c95f99f WatchSource:0}: Error finding container 0ca32c8298586eec1eadfa8e443dc935f6b3f690c449a787cf06a3161c95f99f: Status 404 returned error can't find the container with id 0ca32c8298586eec1eadfa8e443dc935f6b3f690c449a787cf06a3161c95f99f
Apr 21 10:06:57.788695 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.788649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" event={"ID":"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec","Type":"ContainerStarted","Data":"74398e6ae9f45e5f434da38ad54b0795873ad62fb09eef2df4620f7f19f7cd41"}
Apr 21 10:06:57.788849 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.788701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" event={"ID":"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec","Type":"ContainerStarted","Data":"f28b5526e9a0b895fb20c2428bf170a8c615f7eca17ce955cbf7275332f17794"}
Apr 21 10:06:57.788849 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.788716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" event={"ID":"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec","Type":"ContainerStarted","Data":"0ca32c8298586eec1eadfa8e443dc935f6b3f690c449a787cf06a3161c95f99f"}
Apr 21 10:06:57.789780 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:57.789753 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qg446" event={"ID":"a12c9b59-e4c0-4ecf-b7ef-27e880596b38","Type":"ContainerStarted","Data":"be794095781f2cb0445ebe6848d898ca465a46ec23c84aa39ea7310565602cdf"}
Apr 21 10:06:58.794567 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:58.794535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" event={"ID":"2fad54c4-d0f0-4d68-8efb-bf41e5c630ec","Type":"ContainerStarted","Data":"19f13646276e7f354449b6d69fd8c381f8b9fd78e58b7c5eca4d69987a8d7e99"}
Apr 21 10:06:58.795901 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:58.795861 2575 generic.go:358] "Generic (PLEG): container finished" podID="a12c9b59-e4c0-4ecf-b7ef-27e880596b38" containerID="d8b9dc154ff64b5e69871e4c755e31f185d587033b4002cb80e3747f62aace83" exitCode=0
Apr 21 10:06:58.796002 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:58.795950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qg446" event={"ID":"a12c9b59-e4c0-4ecf-b7ef-27e880596b38","Type":"ContainerDied","Data":"d8b9dc154ff64b5e69871e4c755e31f185d587033b4002cb80e3747f62aace83"}
Apr 21 10:06:58.813975 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:58.813929 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zljwr" podStartSLOduration=1.917299915 podStartE2EDuration="2.813917673s" podCreationTimestamp="2026-04-21 10:06:56 +0000 UTC" firstStartedPulling="2026-04-21 10:06:57.52572718 +0000 UTC m=+191.910566091" lastFinishedPulling="2026-04-21 10:06:58.422344942 +0000 UTC m=+192.807183849" observedRunningTime="2026-04-21 10:06:58.813164579 +0000 UTC m=+193.198003506" watchObservedRunningTime="2026-04-21 10:06:58.813917673 +0000 UTC m=+193.198756601"
Apr 21 10:06:59.801144 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:59.801054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qg446" event={"ID":"a12c9b59-e4c0-4ecf-b7ef-27e880596b38","Type":"ContainerStarted","Data":"8e1fc9bbcf309d126c5f4f7f20d1a0dd75352008c3a431fc9c2949bb61ae4a88"}
Apr 21 10:06:59.801144 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:59.801090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qg446" event={"ID":"a12c9b59-e4c0-4ecf-b7ef-27e880596b38","Type":"ContainerStarted","Data":"9ef69e92b6eaa8792ab94062aca44093e3c408f58a3d7e27e689b34e8c1a8256"}
Apr 21 10:06:59.835270 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:06:59.835226 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qg446" podStartSLOduration=2.085920854 podStartE2EDuration="2.835212918s" podCreationTimestamp="2026-04-21 10:06:57 +0000 UTC" firstStartedPulling="2026-04-21 10:06:57.372055444 +0000 UTC m=+191.756894355" lastFinishedPulling="2026-04-21 10:06:58.121347506 +0000 UTC m=+192.506186419" observedRunningTime="2026-04-21 10:06:59.832535833 +0000 UTC m=+194.217374758" watchObservedRunningTime="2026-04-21 10:06:59.835212918 +0000 UTC m=+194.220051844"
Apr 21 10:07:06.753525 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:06.753497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-779dbdc744-bx9cs"
Apr 21 10:07:33.886096 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:33.886065 2575 generic.go:358] "Generic (PLEG): container finished" podID="95ed4025-e077-4170-b387-681d37c22925" containerID="4dca6e9d42ae6544ca0a8c2777b11fa3a4b5fa0505b2beb5a4d84cca38c85fb9" exitCode=0
Apr 21 10:07:33.886405 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:33.886137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" event={"ID":"95ed4025-e077-4170-b387-681d37c22925","Type":"ContainerDied","Data":"4dca6e9d42ae6544ca0a8c2777b11fa3a4b5fa0505b2beb5a4d84cca38c85fb9"}
Apr 21 10:07:33.886443 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:33.886423 2575 scope.go:117] "RemoveContainer" containerID="4dca6e9d42ae6544ca0a8c2777b11fa3a4b5fa0505b2beb5a4d84cca38c85fb9"
Apr 21 10:07:34.890210 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:34.890181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzcbd" event={"ID":"95ed4025-e077-4170-b387-681d37c22925","Type":"ContainerStarted","Data":"ac2735e2a1a00b4cc42d71a5b55bc5dc889727edf1e4621f5d979e2d73e7ea32"}
Apr 21 10:07:34.997338 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:34.997312 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9nzkd_15e1aa47-50b2-4765-a870-9c646ae4fb01/dns-node-resolver/0.log"
Apr 21 10:07:56.952084 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:56.952013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz"
Apr 21 10:07:56.954253 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:56.954228 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1100ece6-afda-453b-8595-490c94dbb90d-metrics-certs\") pod \"network-metrics-daemon-z7xfz\" (UID: \"1100ece6-afda-453b-8595-490c94dbb90d\") " pod="openshift-multus/network-metrics-daemon-z7xfz"
Apr 21 10:07:57.221870 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:57.221791 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfhd5\""
Apr 21 10:07:57.229844 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:57.229827 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z7xfz"
Apr 21 10:07:57.341847 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:57.341752 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z7xfz"]
Apr 21 10:07:57.344214 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:07:57.344188 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1100ece6_afda_453b_8595_490c94dbb90d.slice/crio-bd7d11ebb3b69cdc62d204a207d4b9c7c8416d7b7b3a442dc947c9d3153df756 WatchSource:0}: Error finding container bd7d11ebb3b69cdc62d204a207d4b9c7c8416d7b7b3a442dc947c9d3153df756: Status 404 returned error can't find the container with id bd7d11ebb3b69cdc62d204a207d4b9c7c8416d7b7b3a442dc947c9d3153df756
Apr 21 10:07:57.948649 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:57.948609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z7xfz" event={"ID":"1100ece6-afda-453b-8595-490c94dbb90d","Type":"ContainerStarted","Data":"bd7d11ebb3b69cdc62d204a207d4b9c7c8416d7b7b3a442dc947c9d3153df756"}
Apr 21 10:07:58.953089 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:58.953048 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z7xfz" event={"ID":"1100ece6-afda-453b-8595-490c94dbb90d","Type":"ContainerStarted","Data":"7dff3d0a4701e7c74d18af91a4f9b6964271884746a40a58b52bfb965fb6ee34"}
Apr 21 10:07:58.953089 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:58.953087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z7xfz" event={"ID":"1100ece6-afda-453b-8595-490c94dbb90d","Type":"ContainerStarted","Data":"d03aad85268ed6766af8396e4886f75337b8c79b3ef97b775dc7aa8b3a1f12ce"}
Apr 21 10:07:58.970102 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:07:58.970061 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z7xfz" podStartSLOduration=252.01143806 podStartE2EDuration="4m12.970047528s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:07:57.346049139 +0000 UTC m=+251.730888042" lastFinishedPulling="2026-04-21 10:07:58.304658603 +0000 UTC m=+252.689497510" observedRunningTime="2026-04-21 10:07:58.968459757 +0000 UTC m=+253.353298695" watchObservedRunningTime="2026-04-21 10:07:58.970047528 +0000 UTC m=+253.354886454"
Apr 21 10:08:23.684795 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:08:23.684742 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-968lb" podUID="1532c4fc-8a96-448b-954a-32eca5cac710"
Apr 21 10:08:23.684795 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:08:23.684751 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pwrsm" podUID="4004e34e-ae2c-4815-bea5-806b5e15036f"
Apr 21 10:08:24.016628 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:24.016554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-968lb"
Apr 21 10:08:24.016628 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:24.016619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pwrsm"
Apr 21 10:08:27.478587 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.478545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb"
Apr 21 10:08:27.479005 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.478607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm"
Apr 21 10:08:27.480969 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.480940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4004e34e-ae2c-4815-bea5-806b5e15036f-metrics-tls\") pod \"dns-default-pwrsm\" (UID: \"4004e34e-ae2c-4815-bea5-806b5e15036f\") " pod="openshift-dns/dns-default-pwrsm"
Apr 21 10:08:27.481077 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.481012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1532c4fc-8a96-448b-954a-32eca5cac710-cert\") pod \"ingress-canary-968lb\" (UID: \"1532c4fc-8a96-448b-954a-32eca5cac710\") " pod="openshift-ingress-canary/ingress-canary-968lb"
Apr 21 10:08:27.620425 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.620391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2fp54\""
Apr 21 10:08:27.621223 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.621203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nqzpr\""
Apr 21 10:08:27.628373 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.628356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pwrsm"
Apr 21 10:08:27.628457 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.628444 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-968lb"
Apr 21 10:08:27.758817 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.758788 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-968lb"]
Apr 21 10:08:27.761204 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:08:27.761173 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1532c4fc_8a96_448b_954a_32eca5cac710.slice/crio-826ee4ffd7596f139221d641ea48cde2591f590e02c5e3b6a4fc1b1f21542e47 WatchSource:0}: Error finding container 826ee4ffd7596f139221d641ea48cde2591f590e02c5e3b6a4fc1b1f21542e47: Status 404 returned error can't find the container with id 826ee4ffd7596f139221d641ea48cde2591f590e02c5e3b6a4fc1b1f21542e47
Apr 21 10:08:27.774049 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:27.774025 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pwrsm"]
Apr 21 10:08:27.776891 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:08:27.776853 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4004e34e_ae2c_4815_bea5_806b5e15036f.slice/crio-c5b48fdabc3b2879bda25b76ffba106581a93e0396674020d954c622d3d70f37 WatchSource:0}: Error finding container c5b48fdabc3b2879bda25b76ffba106581a93e0396674020d954c622d3d70f37: Status 404 returned error can't find the container with id c5b48fdabc3b2879bda25b76ffba106581a93e0396674020d954c622d3d70f37
Apr 21 10:08:28.027903 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:28.027802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pwrsm" event={"ID":"4004e34e-ae2c-4815-bea5-806b5e15036f","Type":"ContainerStarted","Data":"c5b48fdabc3b2879bda25b76ffba106581a93e0396674020d954c622d3d70f37"}
Apr 21 10:08:28.028693 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:28.028662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-968lb" event={"ID":"1532c4fc-8a96-448b-954a-32eca5cac710","Type":"ContainerStarted","Data":"826ee4ffd7596f139221d641ea48cde2591f590e02c5e3b6a4fc1b1f21542e47"}
Apr 21 10:08:30.037703 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:30.037664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pwrsm" event={"ID":"4004e34e-ae2c-4815-bea5-806b5e15036f","Type":"ContainerStarted","Data":"bd332e2db0adbb380bbd1131f8be2997055f49d91635f512afc6b6d58a001004"}
Apr 21 10:08:30.038109 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:30.037712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pwrsm" event={"ID":"4004e34e-ae2c-4815-bea5-806b5e15036f","Type":"ContainerStarted","Data":"e01ac8e2d4896e3c50a23e3ab5ed4b85a53317d3c08e3a08ad79d3cccbd060eb"}
Apr 21 10:08:30.038109 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:30.037785 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pwrsm"
Apr 21 10:08:30.039055 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:30.039035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-968lb" event={"ID":"1532c4fc-8a96-448b-954a-32eca5cac710","Type":"ContainerStarted","Data":"7a10491e46beb1d295e54cc2f9b0b0e298b1c4ba4e25208efc284dbb1bd1e9a3"}
Apr 21 10:08:30.055732 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:30.055692 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pwrsm" podStartSLOduration=251.183477271 podStartE2EDuration="4m13.055682633s" podCreationTimestamp="2026-04-21 10:04:17 +0000 UTC" firstStartedPulling="2026-04-21 10:08:27.778427075 +0000 UTC m=+282.163265982" lastFinishedPulling="2026-04-21 10:08:29.650632426 +0000 UTC m=+284.035471344" observedRunningTime="2026-04-21 10:08:30.054038973 +0000 UTC m=+284.438877899" watchObservedRunningTime="2026-04-21 10:08:30.055682633 +0000 UTC m=+284.440521560"
Apr 21 10:08:30.070441 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:30.070388 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-968lb" podStartSLOduration=251.180688102 podStartE2EDuration="4m13.070368874s" podCreationTimestamp="2026-04-21 10:04:17 +0000 UTC" firstStartedPulling="2026-04-21 10:08:27.763031571 +0000 UTC m=+282.147870478" lastFinishedPulling="2026-04-21 10:08:29.652712342 +0000 UTC m=+284.037551250" observedRunningTime="2026-04-21 10:08:30.068463997 +0000 UTC m=+284.453302925" watchObservedRunningTime="2026-04-21 10:08:30.070368874 +0000 UTC m=+284.455207800"
Apr 21 10:08:40.048127 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:40.048088 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pwrsm"
Apr 21 10:08:46.114562 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:46.114521 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log"
Apr 21 10:08:46.115078 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:46.114706 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log"
Apr 21 10:08:46.121021 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:08:46.120997 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 10:10:41.948155 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.948124 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"]
Apr 21 10:10:41.951094 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.951076 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:41.953543 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.953523 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 10:10:41.953655 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.953581 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 10:10:41.954695 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.954677 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 10:10:41.954695 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.954684 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 10:10:41.959216 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.959197 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"]
Apr 21 10:10:41.969709 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.969688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-klusterlet-config\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:41.969804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.969730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-tmp\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:41.969848 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:41.969830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnhm\" (UniqueName: \"kubernetes.io/projected/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-kube-api-access-tnnhm\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.070375 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.070351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-tmp\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.070471 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.070386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnhm\" (UniqueName: \"kubernetes.io/projected/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-kube-api-access-tnnhm\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.070471 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.070418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-klusterlet-config\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.070764 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.070749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-tmp\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.072801 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.072774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-klusterlet-config\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.078989 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.078969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnhm\" (UniqueName: \"kubernetes.io/projected/60d651ac-65ed-4ab9-a4b8-60ffe909d14e-kube-api-access-tnnhm\") pod \"klusterlet-addon-workmgr-6989898df9-hsf9m\" (UID: \"60d651ac-65ed-4ab9-a4b8-60ffe909d14e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.261283 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.261227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:42.374183 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.374161 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"]
Apr 21 10:10:42.376676 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:10:42.376647 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d651ac_65ed_4ab9_a4b8_60ffe909d14e.slice/crio-b1868cd73adb8f2252453bd9bd58af57d5b7057b8406f2657ef47611b010e96b WatchSource:0}: Error finding container b1868cd73adb8f2252453bd9bd58af57d5b7057b8406f2657ef47611b010e96b: Status 404 returned error can't find the container with id b1868cd73adb8f2252453bd9bd58af57d5b7057b8406f2657ef47611b010e96b
Apr 21 10:10:42.378208 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.378184 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:10:42.401126 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:42.401097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m" event={"ID":"60d651ac-65ed-4ab9-a4b8-60ffe909d14e","Type":"ContainerStarted","Data":"b1868cd73adb8f2252453bd9bd58af57d5b7057b8406f2657ef47611b010e96b"}
Apr 21 10:10:46.413747 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:46.413708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m" event={"ID":"60d651ac-65ed-4ab9-a4b8-60ffe909d14e","Type":"ContainerStarted","Data":"cd1680b833155b6c397b2fd8990f4b362010a24a81cac4a2b1b33526b6c45e80"}
Apr 21 10:10:46.414142 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:46.413909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:46.415598 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:46.415578 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m"
Apr 21 10:10:46.430402 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:46.430360 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6989898df9-hsf9m" podStartSLOduration=1.940823465 podStartE2EDuration="5.430349207s" podCreationTimestamp="2026-04-21 10:10:41 +0000 UTC" firstStartedPulling="2026-04-21 10:10:42.37832566 +0000 UTC m=+416.763164564" lastFinishedPulling="2026-04-21 10:10:45.867851396 +0000 UTC m=+420.252690306" observedRunningTime="2026-04-21 10:10:46.429618691 +0000 UTC m=+420.814457617" watchObservedRunningTime="2026-04-21 10:10:46.430349207 +0000 UTC m=+420.815188133"
Apr 21 10:10:52.761998 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.761953 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"]
Apr 21 10:10:52.767139 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.767105 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.769984 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.769953 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 10:10:52.770122 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.769962 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 10:10:52.771393 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.771369 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q44hx\""
Apr 21 10:10:52.773232 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.773208 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"]
Apr 21 10:10:52.838256 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.838229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.838364 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.838267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.838364 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.838335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td9nc\" (UniqueName: \"kubernetes.io/projected/3c996fde-088d-4a53-9d69-ef078cd126f9-kube-api-access-td9nc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.939363 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.939338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.939455 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.939374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.939455 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.939404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td9nc\" (UniqueName: \"kubernetes.io/projected/3c996fde-088d-4a53-9d69-ef078cd126f9-kube-api-access-td9nc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.939703 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.939683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.939743 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.939716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:52.949647 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:52.949614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td9nc\" (UniqueName: \"kubernetes.io/projected/3c996fde-088d-4a53-9d69-ef078cd126f9-kube-api-access-td9nc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"
Apr 21 10:10:53.077292 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:53.077242 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" Apr 21 10:10:53.211185 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:53.211158 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8"] Apr 21 10:10:53.214093 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:10:53.214069 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c996fde_088d_4a53_9d69_ef078cd126f9.slice/crio-06e8035ef5614026a878dab072cf4a2b90d4ed02b80a7fcb89afbf7b84de599a WatchSource:0}: Error finding container 06e8035ef5614026a878dab072cf4a2b90d4ed02b80a7fcb89afbf7b84de599a: Status 404 returned error can't find the container with id 06e8035ef5614026a878dab072cf4a2b90d4ed02b80a7fcb89afbf7b84de599a Apr 21 10:10:53.434978 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:53.434946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" event={"ID":"3c996fde-088d-4a53-9d69-ef078cd126f9","Type":"ContainerStarted","Data":"06e8035ef5614026a878dab072cf4a2b90d4ed02b80a7fcb89afbf7b84de599a"} Apr 21 10:10:58.451036 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:58.450977 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerID="b553f52290d50861a69222c1f47fb31c02541fff4882cdc00e024ff1b91e27e2" exitCode=0 Apr 21 10:10:58.451381 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:10:58.451073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" event={"ID":"3c996fde-088d-4a53-9d69-ef078cd126f9","Type":"ContainerDied","Data":"b553f52290d50861a69222c1f47fb31c02541fff4882cdc00e024ff1b91e27e2"} Apr 21 10:11:01.462289 ip-10-0-137-205 kubenswrapper[2575]: 
I0421 10:11:01.462255 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerID="fe9191aa44f88be6ad54dd5bed245488e29fdf8d8563d196c392972883d9582f" exitCode=0 Apr 21 10:11:01.462728 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:01.462294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" event={"ID":"3c996fde-088d-4a53-9d69-ef078cd126f9","Type":"ContainerDied","Data":"fe9191aa44f88be6ad54dd5bed245488e29fdf8d8563d196c392972883d9582f"} Apr 21 10:11:08.484418 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:08.484378 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerID="d3cbb92d80b2b9c3a0be70cc6e1adc3a6c5e4502c4461ae4255fe5d07ca38fbb" exitCode=0 Apr 21 10:11:08.484786 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:08.484475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" event={"ID":"3c996fde-088d-4a53-9d69-ef078cd126f9","Type":"ContainerDied","Data":"d3cbb92d80b2b9c3a0be70cc6e1adc3a6c5e4502c4461ae4255fe5d07ca38fbb"} Apr 21 10:11:09.608137 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.608113 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" Apr 21 10:11:09.657146 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.657125 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-util\") pod \"3c996fde-088d-4a53-9d69-ef078cd126f9\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " Apr 21 10:11:09.657253 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.657157 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-bundle\") pod \"3c996fde-088d-4a53-9d69-ef078cd126f9\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " Apr 21 10:11:09.657253 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.657183 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td9nc\" (UniqueName: \"kubernetes.io/projected/3c996fde-088d-4a53-9d69-ef078cd126f9-kube-api-access-td9nc\") pod \"3c996fde-088d-4a53-9d69-ef078cd126f9\" (UID: \"3c996fde-088d-4a53-9d69-ef078cd126f9\") " Apr 21 10:11:09.657652 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.657624 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-bundle" (OuterVolumeSpecName: "bundle") pod "3c996fde-088d-4a53-9d69-ef078cd126f9" (UID: "3c996fde-088d-4a53-9d69-ef078cd126f9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:09.659317 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.659293 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c996fde-088d-4a53-9d69-ef078cd126f9-kube-api-access-td9nc" (OuterVolumeSpecName: "kube-api-access-td9nc") pod "3c996fde-088d-4a53-9d69-ef078cd126f9" (UID: "3c996fde-088d-4a53-9d69-ef078cd126f9"). InnerVolumeSpecName "kube-api-access-td9nc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:11:09.661131 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.661107 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-util" (OuterVolumeSpecName: "util") pod "3c996fde-088d-4a53-9d69-ef078cd126f9" (UID: "3c996fde-088d-4a53-9d69-ef078cd126f9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:11:09.757986 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.757935 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-util\") on node \"ip-10-0-137-205.ec2.internal\" DevicePath \"\"" Apr 21 10:11:09.757986 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.757955 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c996fde-088d-4a53-9d69-ef078cd126f9-bundle\") on node \"ip-10-0-137-205.ec2.internal\" DevicePath \"\"" Apr 21 10:11:09.757986 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:09.757966 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-td9nc\" (UniqueName: \"kubernetes.io/projected/3c996fde-088d-4a53-9d69-ef078cd126f9-kube-api-access-td9nc\") on node \"ip-10-0-137-205.ec2.internal\" DevicePath \"\"" Apr 21 10:11:10.491812 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:10.491749 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" event={"ID":"3c996fde-088d-4a53-9d69-ef078cd126f9","Type":"ContainerDied","Data":"06e8035ef5614026a878dab072cf4a2b90d4ed02b80a7fcb89afbf7b84de599a"} Apr 21 10:11:10.491812 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:10.491789 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06e8035ef5614026a878dab072cf4a2b90d4ed02b80a7fcb89afbf7b84de599a" Apr 21 10:11:10.491812 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:10.491800 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ch7hw8" Apr 21 10:11:14.537491 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537455 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg"] Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537734 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="pull" Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537745 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="pull" Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537754 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="util" Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537760 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="util" Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537766 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="extract" Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537772 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="extract" Apr 21 10:11:14.537962 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.537817 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c996fde-088d-4a53-9d69-ef078cd126f9" containerName="extract" Apr 21 10:11:14.575346 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.575315 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg"] Apr 21 10:11:14.575478 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.575426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.579809 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.579784 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 21 10:11:14.580079 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.580054 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 21 10:11:14.580079 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.580082 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-mst78\"" Apr 21 10:11:14.581079 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.581057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 21 10:11:14.692216 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.692188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmslf\" (UniqueName: 
\"kubernetes.io/projected/3050790c-5371-470d-9f6f-cc96f762a854-kube-api-access-fmslf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg\" (UID: \"3050790c-5371-470d-9f6f-cc96f762a854\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.692310 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.692226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3050790c-5371-470d-9f6f-cc96f762a854-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg\" (UID: \"3050790c-5371-470d-9f6f-cc96f762a854\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.792647 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.792593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmslf\" (UniqueName: \"kubernetes.io/projected/3050790c-5371-470d-9f6f-cc96f762a854-kube-api-access-fmslf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg\" (UID: \"3050790c-5371-470d-9f6f-cc96f762a854\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.792647 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.792626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3050790c-5371-470d-9f6f-cc96f762a854-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg\" (UID: \"3050790c-5371-470d-9f6f-cc96f762a854\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.794815 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.794793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3050790c-5371-470d-9f6f-cc96f762a854-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg\" (UID: 
\"3050790c-5371-470d-9f6f-cc96f762a854\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.802815 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.802791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmslf\" (UniqueName: \"kubernetes.io/projected/3050790c-5371-470d-9f6f-cc96f762a854-kube-api-access-fmslf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg\" (UID: \"3050790c-5371-470d-9f6f-cc96f762a854\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:14.886173 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:14.886150 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:15.009814 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:15.009786 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg"] Apr 21 10:11:15.013365 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:11:15.013334 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3050790c_5371_470d_9f6f_cc96f762a854.slice/crio-1b264a8208ef5aea6e659d6f284f3ae38f2ea76bbc620f329ecfad19b390aef6 WatchSource:0}: Error finding container 1b264a8208ef5aea6e659d6f284f3ae38f2ea76bbc620f329ecfad19b390aef6: Status 404 returned error can't find the container with id 1b264a8208ef5aea6e659d6f284f3ae38f2ea76bbc620f329ecfad19b390aef6 Apr 21 10:11:15.505869 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:15.505842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" event={"ID":"3050790c-5371-470d-9f6f-cc96f762a854","Type":"ContainerStarted","Data":"1b264a8208ef5aea6e659d6f284f3ae38f2ea76bbc620f329ecfad19b390aef6"} Apr 21 10:11:19.122983 ip-10-0-137-205 kubenswrapper[2575]: 
I0421 10:11:19.122955 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gpbs9"] Apr 21 10:11:19.147968 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.147945 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gpbs9"] Apr 21 10:11:19.148091 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.148065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.150779 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.150755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 21 10:11:19.150779 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.150767 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-7m985\"" Apr 21 10:11:19.150968 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.150813 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 21 10:11:19.224294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.223901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzs5\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-kube-api-access-6fzs5\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.224294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.223968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " 
pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.224294 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.223996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f9c469d6-8aff-4db3-b677-2f115a00ad56-cabundle0\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.325165 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.325143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzs5\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-kube-api-access-6fzs5\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.325267 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.325202 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.325267 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.325230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f9c469d6-8aff-4db3-b677-2f115a00ad56-cabundle0\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.325355 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.325300 2575 secret.go:281] references non-existent secret key: ca.crt Apr 21 10:11:19.325355 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.325316 2575 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 21 10:11:19.325355 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.325325 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gpbs9: references non-existent secret key: ca.crt Apr 21 10:11:19.325497 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.325367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates podName:f9c469d6-8aff-4db3-b677-2f115a00ad56 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:19.825352183 +0000 UTC m=+454.210191087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates") pod "keda-operator-ffbb595cb-gpbs9" (UID: "f9c469d6-8aff-4db3-b677-2f115a00ad56") : references non-existent secret key: ca.crt Apr 21 10:11:19.325846 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.325827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f9c469d6-8aff-4db3-b677-2f115a00ad56-cabundle0\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.335813 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.335791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzs5\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-kube-api-access-6fzs5\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.391838 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.391753 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"] Apr 21 10:11:19.412506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.412485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"] Apr 21 10:11:19.412615 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.412596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.415010 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.414991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 21 10:11:19.518766 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.518740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" event={"ID":"3050790c-5371-470d-9f6f-cc96f762a854","Type":"ContainerStarted","Data":"ac61b6c6723743fb85dda194ec0a2bd8ef61ea6a5c7fa269323da08514e916c3"} Apr 21 10:11:19.518918 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.518900 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" Apr 21 10:11:19.527230 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.527210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13616c9e-3a87-4f8f-9430-7660bec8e4b3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.527318 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.527258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.527361 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.527336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxn4\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-kube-api-access-bhxn4\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.536998 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.536951 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg" podStartSLOduration=1.969204306 podStartE2EDuration="5.536936917s" podCreationTimestamp="2026-04-21 10:11:14 +0000 UTC" firstStartedPulling="2026-04-21 10:11:15.015252534 +0000 UTC m=+449.400091439" lastFinishedPulling="2026-04-21 10:11:18.582985134 +0000 UTC m=+452.967824050" observedRunningTime="2026-04-21 10:11:19.536057193 +0000 UTC m=+453.920896124" watchObservedRunningTime="2026-04-21 10:11:19.536936917 +0000 UTC m=+453.921775841" Apr 21 10:11:19.628348 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.628328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.628454 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.628383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxn4\" 
(UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-kube-api-access-bhxn4\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.628518 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.628482 2575 secret.go:281] references non-existent secret key: tls.crt Apr 21 10:11:19.628518 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.628505 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 21 10:11:19.628634 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.628527 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q: references non-existent secret key: tls.crt Apr 21 10:11:19.628634 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.628571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13616c9e-3a87-4f8f-9430-7660bec8e4b3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.628634 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.628599 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates podName:13616c9e-3a87-4f8f-9430-7660bec8e4b3 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:20.128572069 +0000 UTC m=+454.513410979 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates") pod "keda-metrics-apiserver-7c9f485588-nxc9q" (UID: "13616c9e-3a87-4f8f-9430-7660bec8e4b3") : references non-existent secret key: tls.crt Apr 21 10:11:19.628934 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.628915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13616c9e-3a87-4f8f-9430-7660bec8e4b3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.637979 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.637954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxn4\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-kube-api-access-bhxn4\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" Apr 21 10:11:19.677407 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.677384 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-pzcfj"] Apr 21 10:11:19.694237 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.694217 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pzcfj"] Apr 21 10:11:19.694342 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.694328 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pzcfj" Apr 21 10:11:19.696894 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.696861 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 21 10:11:19.729719 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.729694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b3b839d9-fcb8-4392-91ab-a82e2b11c339-certificates\") pod \"keda-admission-cf49989db-pzcfj\" (UID: \"b3b839d9-fcb8-4392-91ab-a82e2b11c339\") " pod="openshift-keda/keda-admission-cf49989db-pzcfj" Apr 21 10:11:19.729827 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.729757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhq9\" (UniqueName: \"kubernetes.io/projected/b3b839d9-fcb8-4392-91ab-a82e2b11c339-kube-api-access-bjhq9\") pod \"keda-admission-cf49989db-pzcfj\" (UID: \"b3b839d9-fcb8-4392-91ab-a82e2b11c339\") " pod="openshift-keda/keda-admission-cf49989db-pzcfj" Apr 21 10:11:19.831187 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.831154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" Apr 21 10:11:19.831359 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.831205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b3b839d9-fcb8-4392-91ab-a82e2b11c339-certificates\") pod \"keda-admission-cf49989db-pzcfj\" (UID: \"b3b839d9-fcb8-4392-91ab-a82e2b11c339\") " pod="openshift-keda/keda-admission-cf49989db-pzcfj" 
Apr 21 10:11:19.831359 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.831258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhq9\" (UniqueName: \"kubernetes.io/projected/b3b839d9-fcb8-4392-91ab-a82e2b11c339-kube-api-access-bjhq9\") pod \"keda-admission-cf49989db-pzcfj\" (UID: \"b3b839d9-fcb8-4392-91ab-a82e2b11c339\") " pod="openshift-keda/keda-admission-cf49989db-pzcfj"
Apr 21 10:11:19.831359 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.831347 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 21 10:11:19.831522 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.831369 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 10:11:19.831522 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.831381 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gpbs9: references non-existent secret key: ca.crt
Apr 21 10:11:19.831522 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:19.831441 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates podName:f9c469d6-8aff-4db3-b677-2f115a00ad56 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:20.831420939 +0000 UTC m=+455.216259846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates") pod "keda-operator-ffbb595cb-gpbs9" (UID: "f9c469d6-8aff-4db3-b677-2f115a00ad56") : references non-existent secret key: ca.crt
Apr 21 10:11:19.834061 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.834021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b3b839d9-fcb8-4392-91ab-a82e2b11c339-certificates\") pod \"keda-admission-cf49989db-pzcfj\" (UID: \"b3b839d9-fcb8-4392-91ab-a82e2b11c339\") " pod="openshift-keda/keda-admission-cf49989db-pzcfj"
Apr 21 10:11:19.841125 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:19.841099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhq9\" (UniqueName: \"kubernetes.io/projected/b3b839d9-fcb8-4392-91ab-a82e2b11c339-kube-api-access-bjhq9\") pod \"keda-admission-cf49989db-pzcfj\" (UID: \"b3b839d9-fcb8-4392-91ab-a82e2b11c339\") " pod="openshift-keda/keda-admission-cf49989db-pzcfj"
Apr 21 10:11:20.004935 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:20.004842 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pzcfj"
Apr 21 10:11:20.134659 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:20.134547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:20.135080 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.134747 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 21 10:11:20.135080 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.134763 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 10:11:20.135080 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.134785 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q: references non-existent secret key: tls.crt
Apr 21 10:11:20.135080 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.134839 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates podName:13616c9e-3a87-4f8f-9430-7660bec8e4b3 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:21.134822465 +0000 UTC m=+455.519661389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates") pod "keda-metrics-apiserver-7c9f485588-nxc9q" (UID: "13616c9e-3a87-4f8f-9430-7660bec8e4b3") : references non-existent secret key: tls.crt
Apr 21 10:11:20.136733 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:20.136708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pzcfj"]
Apr 21 10:11:20.140695 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:11:20.140664 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3b839d9_fcb8_4392_91ab_a82e2b11c339.slice/crio-89c5869af42b2536eda867068ee7a4577f3f70a0721e1f5149c1f47fa8a813f5 WatchSource:0}: Error finding container 89c5869af42b2536eda867068ee7a4577f3f70a0721e1f5149c1f47fa8a813f5: Status 404 returned error can't find the container with id 89c5869af42b2536eda867068ee7a4577f3f70a0721e1f5149c1f47fa8a813f5
Apr 21 10:11:20.522610 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:20.522566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pzcfj" event={"ID":"b3b839d9-fcb8-4392-91ab-a82e2b11c339","Type":"ContainerStarted","Data":"89c5869af42b2536eda867068ee7a4577f3f70a0721e1f5149c1f47fa8a813f5"}
Apr 21 10:11:20.845902 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:20.845810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:11:20.846051 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.845958 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 21 10:11:20.846051 ip-10-0-137-205 kubenswrapper[2575]: E0421
10:11:20.845977 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 10:11:20.846051 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.845986 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gpbs9: references non-existent secret key: ca.crt
Apr 21 10:11:20.846051 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:20.846039 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates podName:f9c469d6-8aff-4db3-b677-2f115a00ad56 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:22.846024657 +0000 UTC m=+457.230863561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates") pod "keda-operator-ffbb595cb-gpbs9" (UID: "f9c469d6-8aff-4db3-b677-2f115a00ad56") : references non-existent secret key: ca.crt
Apr 21 10:11:21.148075 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:21.147985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:21.148539 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:21.148151 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 21 10:11:21.148539 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:21.148173 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 10:11:21.148539 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:21.148202 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q: references non-existent secret key: tls.crt
Apr 21 10:11:21.148539 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:21.148274 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates podName:13616c9e-3a87-4f8f-9430-7660bec8e4b3 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:23.148254077 +0000 UTC m=+457.533092997 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates") pod "keda-metrics-apiserver-7c9f485588-nxc9q" (UID: "13616c9e-3a87-4f8f-9430-7660bec8e4b3") : references non-existent secret key: tls.crt
Apr 21 10:11:22.529828 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:22.529784 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pzcfj" event={"ID":"b3b839d9-fcb8-4392-91ab-a82e2b11c339","Type":"ContainerStarted","Data":"347e8f194fd5fee25cd20434d03805f0b58a3eff3afbe1ad87a1fb786e1818ef"}
Apr 21 10:11:22.530200 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:22.529919 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-pzcfj"
Apr 21 10:11:22.546786 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:22.546736 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-pzcfj" podStartSLOduration=1.349561658 podStartE2EDuration="3.546719032s" podCreationTimestamp="2026-04-21 10:11:19 +0000 UTC" firstStartedPulling="2026-04-21 10:11:20.143021904 +0000 UTC m=+454.527860814" lastFinishedPulling="2026-04-21 10:11:22.340179281 +0000 UTC m=+456.725018188" observedRunningTime="2026-04-21 10:11:22.546387741 +0000 UTC m=+456.931226670" watchObservedRunningTime="2026-04-21 10:11:22.546719032 +0000 UTC m=+456.931557960"
Apr 21 10:11:22.862253 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:22.862230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:11:22.862397 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:22.862369 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 21 10:11:22.862397 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:22.862386 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 10:11:22.862397 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:22.862395 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gpbs9: references non-existent secret key: ca.crt
Apr 21 10:11:22.862497 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:22.862439 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates podName:f9c469d6-8aff-4db3-b677-2f115a00ad56 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:26.862425203 +0000 UTC m=+461.247264108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates") pod "keda-operator-ffbb595cb-gpbs9" (UID: "f9c469d6-8aff-4db3-b677-2f115a00ad56") : references non-existent secret key: ca.crt
Apr 21 10:11:23.164314 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:23.164261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:23.164405 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:23.164388 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 21 10:11:23.164405 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:23.164402 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 10:11:23.164475 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:23.164417 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q: references non-existent secret key: tls.crt
Apr 21 10:11:23.164475 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:11:23.164463 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates podName:13616c9e-3a87-4f8f-9430-7660bec8e4b3 nodeName:}" failed. No retries permitted until 2026-04-21 10:11:27.164447093 +0000 UTC m=+461.549285997 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates") pod "keda-metrics-apiserver-7c9f485588-nxc9q" (UID: "13616c9e-3a87-4f8f-9430-7660bec8e4b3") : references non-existent secret key: tls.crt
Apr 21 10:11:26.893488 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:26.893452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:11:26.896020 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:26.895999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f9c469d6-8aff-4db3-b677-2f115a00ad56-certificates\") pod \"keda-operator-ffbb595cb-gpbs9\" (UID: \"f9c469d6-8aff-4db3-b677-2f115a00ad56\") " pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:11:26.957508 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:26.957474 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:11:27.094102 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.094072 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gpbs9"]
Apr 21 10:11:27.097441 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:11:27.097415 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c469d6_8aff_4db3_b677_2f115a00ad56.slice/crio-a556c9afbc40f91f1278e489d462cc2565ba889d83ec6234a1fba31651cec1ec WatchSource:0}: Error finding container a556c9afbc40f91f1278e489d462cc2565ba889d83ec6234a1fba31651cec1ec: Status 404 returned error can't find the container with id a556c9afbc40f91f1278e489d462cc2565ba889d83ec6234a1fba31651cec1ec
Apr 21 10:11:27.196847 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.196824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:27.199055 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.199037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13616c9e-3a87-4f8f-9430-7660bec8e4b3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nxc9q\" (UID: \"13616c9e-3a87-4f8f-9430-7660bec8e4b3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:27.222974 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.222954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:27.335574 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.335553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"]
Apr 21 10:11:27.337642 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:11:27.337616 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13616c9e_3a87_4f8f_9430_7660bec8e4b3.slice/crio-79e9cb5a167167b37ed13ba8031306ff953fc0d62c11bc7f1c922635949d5523 WatchSource:0}: Error finding container 79e9cb5a167167b37ed13ba8031306ff953fc0d62c11bc7f1c922635949d5523: Status 404 returned error can't find the container with id 79e9cb5a167167b37ed13ba8031306ff953fc0d62c11bc7f1c922635949d5523
Apr 21 10:11:27.544645 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.544576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" event={"ID":"13616c9e-3a87-4f8f-9430-7660bec8e4b3","Type":"ContainerStarted","Data":"79e9cb5a167167b37ed13ba8031306ff953fc0d62c11bc7f1c922635949d5523"}
Apr 21 10:11:27.545671 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:27.545645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" event={"ID":"f9c469d6-8aff-4db3-b677-2f115a00ad56","Type":"ContainerStarted","Data":"a556c9afbc40f91f1278e489d462cc2565ba889d83ec6234a1fba31651cec1ec"}
Apr 21 10:11:31.559325 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:31.559287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" event={"ID":"13616c9e-3a87-4f8f-9430-7660bec8e4b3","Type":"ContainerStarted","Data":"c0c2abd8ad6b04898b18f0a3ff35809690fb49f2840574c8e5e9408d5124f9d0"}
Apr 21 10:11:31.559706 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:31.559370 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:31.560626 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:31.560604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" event={"ID":"f9c469d6-8aff-4db3-b677-2f115a00ad56","Type":"ContainerStarted","Data":"2a03c6eb196f41865b25c6af974947f4ec5d488d35a33cea8ad606c4de5d9d8f"}
Apr 21 10:11:31.560715 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:31.560703 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:11:31.581739 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:31.581696 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q" podStartSLOduration=8.934272003 podStartE2EDuration="12.581682694s" podCreationTimestamp="2026-04-21 10:11:19 +0000 UTC" firstStartedPulling="2026-04-21 10:11:27.3390363 +0000 UTC m=+461.723875204" lastFinishedPulling="2026-04-21 10:11:30.986446992 +0000 UTC m=+465.371285895" observedRunningTime="2026-04-21 10:11:31.57911026 +0000 UTC m=+465.963949187" watchObservedRunningTime="2026-04-21 10:11:31.581682694 +0000 UTC m=+465.966521620"
Apr 21 10:11:31.596190 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:31.596153 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9" podStartSLOduration=8.704139111 podStartE2EDuration="12.596137926s" podCreationTimestamp="2026-04-21 10:11:19 +0000 UTC" firstStartedPulling="2026-04-21 10:11:27.098846548 +0000 UTC m=+461.483685452" lastFinishedPulling="2026-04-21 10:11:30.99084536 +0000 UTC m=+465.375684267" observedRunningTime="2026-04-21 10:11:31.595610524 +0000 UTC m=+465.980449464" watchObservedRunningTime="2026-04-21 10:11:31.596137926 +0000 UTC m=+465.980976854"
Apr 21 10:11:40.524698 ip-10-0-137-205
kubenswrapper[2575]: I0421 10:11:40.524666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fgmvg"
Apr 21 10:11:42.568212 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:42.568176 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nxc9q"
Apr 21 10:11:43.535387 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:43.535349 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-pzcfj"
Apr 21 10:11:52.565922 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:11:52.565873 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-gpbs9"
Apr 21 10:12:27.242673 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.242586 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84685cd884-qxtfl"]
Apr 21 10:12:27.245597 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.245578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.248296 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.248270 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-7hnv2\""
Apr 21 10:12:27.248423 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.248397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 10:12:27.249314 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.249297 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 21 10:12:27.249397 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.249301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 10:12:27.255157 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.255138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-qxtfl"]
Apr 21 10:12:27.415740 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.415714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/838ccee4-82c8-4ad7-b967-7896a0c601e0-cert\") pod \"kserve-controller-manager-84685cd884-qxtfl\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") " pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.415867 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.415765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pq6w\" (UniqueName: \"kubernetes.io/projected/838ccee4-82c8-4ad7-b967-7896a0c601e0-kube-api-access-2pq6w\") pod \"kserve-controller-manager-84685cd884-qxtfl\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") " pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.516957 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.516875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/838ccee4-82c8-4ad7-b967-7896a0c601e0-cert\") pod \"kserve-controller-manager-84685cd884-qxtfl\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") " pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.517074 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.517025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pq6w\" (UniqueName: \"kubernetes.io/projected/838ccee4-82c8-4ad7-b967-7896a0c601e0-kube-api-access-2pq6w\") pod \"kserve-controller-manager-84685cd884-qxtfl\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") " pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.519316 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.519286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/838ccee4-82c8-4ad7-b967-7896a0c601e0-cert\") pod \"kserve-controller-manager-84685cd884-qxtfl\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") " pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.528125 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.528106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pq6w\" (UniqueName: \"kubernetes.io/projected/838ccee4-82c8-4ad7-b967-7896a0c601e0-kube-api-access-2pq6w\") pod \"kserve-controller-manager-84685cd884-qxtfl\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") " pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.555917 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.555872 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:27.672596 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.672565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-qxtfl"]
Apr 21 10:12:27.675731 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:12:27.675704 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod838ccee4_82c8_4ad7_b967_7896a0c601e0.slice/crio-02aa6c82e82426dcc380542dc798b9d551786d9a3ec0c577e7463bf9d0bd5ce9 WatchSource:0}: Error finding container 02aa6c82e82426dcc380542dc798b9d551786d9a3ec0c577e7463bf9d0bd5ce9: Status 404 returned error can't find the container with id 02aa6c82e82426dcc380542dc798b9d551786d9a3ec0c577e7463bf9d0bd5ce9
Apr 21 10:12:27.737141 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:27.737113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84685cd884-qxtfl" event={"ID":"838ccee4-82c8-4ad7-b967-7896a0c601e0","Type":"ContainerStarted","Data":"02aa6c82e82426dcc380542dc798b9d551786d9a3ec0c577e7463bf9d0bd5ce9"}
Apr 21 10:12:30.747600 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:30.747552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84685cd884-qxtfl" event={"ID":"838ccee4-82c8-4ad7-b967-7896a0c601e0","Type":"ContainerStarted","Data":"bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014"}
Apr 21 10:12:30.748095 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:30.747639 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:12:30.765263 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:12:30.765217 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84685cd884-qxtfl" podStartSLOduration=1.322123381
podStartE2EDuration="3.765204142s" podCreationTimestamp="2026-04-21 10:12:27 +0000 UTC" firstStartedPulling="2026-04-21 10:12:27.677293572 +0000 UTC m=+522.062132479" lastFinishedPulling="2026-04-21 10:12:30.120374333 +0000 UTC m=+524.505213240" observedRunningTime="2026-04-21 10:12:30.763643706 +0000 UTC m=+525.148482656" watchObservedRunningTime="2026-04-21 10:12:30.765204142 +0000 UTC m=+525.150043134"
Apr 21 10:13:01.756254 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:01.756220 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:13:03.117271 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.117225 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-qxtfl"]
Apr 21 10:13:03.117767 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.117535 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84685cd884-qxtfl" podUID="838ccee4-82c8-4ad7-b967-7896a0c601e0" containerName="manager" containerID="cri-o://bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014" gracePeriod=10
Apr 21 10:13:03.142476 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.142450 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84685cd884-d5plh"]
Apr 21 10:13:03.145742 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.145724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.156188 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.156161 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-d5plh"]
Apr 21 10:13:03.158846 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.158818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81d5010e-67e4-4592-93c7-cdf9fb0c4282-cert\") pod \"kserve-controller-manager-84685cd884-d5plh\" (UID: \"81d5010e-67e4-4592-93c7-cdf9fb0c4282\") " pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.158986 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.158869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxr9\" (UniqueName: \"kubernetes.io/projected/81d5010e-67e4-4592-93c7-cdf9fb0c4282-kube-api-access-lfxr9\") pod \"kserve-controller-manager-84685cd884-d5plh\" (UID: \"81d5010e-67e4-4592-93c7-cdf9fb0c4282\") " pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.259805 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.259778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81d5010e-67e4-4592-93c7-cdf9fb0c4282-cert\") pod \"kserve-controller-manager-84685cd884-d5plh\" (UID: \"81d5010e-67e4-4592-93c7-cdf9fb0c4282\") " pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.259923 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.259843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxr9\" (UniqueName: \"kubernetes.io/projected/81d5010e-67e4-4592-93c7-cdf9fb0c4282-kube-api-access-lfxr9\") pod \"kserve-controller-manager-84685cd884-d5plh\" (UID: \"81d5010e-67e4-4592-93c7-cdf9fb0c4282\") " pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.262516 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.262495 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81d5010e-67e4-4592-93c7-cdf9fb0c4282-cert\") pod \"kserve-controller-manager-84685cd884-d5plh\" (UID: \"81d5010e-67e4-4592-93c7-cdf9fb0c4282\") " pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.268919 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.268901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxr9\" (UniqueName: \"kubernetes.io/projected/81d5010e-67e4-4592-93c7-cdf9fb0c4282-kube-api-access-lfxr9\") pod \"kserve-controller-manager-84685cd884-d5plh\" (UID: \"81d5010e-67e4-4592-93c7-cdf9fb0c4282\") " pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.349042 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.349024 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:13:03.360289 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.360265 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/838ccee4-82c8-4ad7-b967-7896a0c601e0-cert\") pod \"838ccee4-82c8-4ad7-b967-7896a0c601e0\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") "
Apr 21 10:13:03.360389 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.360310 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pq6w\" (UniqueName: \"kubernetes.io/projected/838ccee4-82c8-4ad7-b967-7896a0c601e0-kube-api-access-2pq6w\") pod \"838ccee4-82c8-4ad7-b967-7896a0c601e0\" (UID: \"838ccee4-82c8-4ad7-b967-7896a0c601e0\") "
Apr 21 10:13:03.362296 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.362272 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838ccee4-82c8-4ad7-b967-7896a0c601e0-kube-api-access-2pq6w" (OuterVolumeSpecName: "kube-api-access-2pq6w") pod "838ccee4-82c8-4ad7-b967-7896a0c601e0" (UID: "838ccee4-82c8-4ad7-b967-7896a0c601e0"). InnerVolumeSpecName "kube-api-access-2pq6w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:13:03.362296 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.362289 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838ccee4-82c8-4ad7-b967-7896a0c601e0-cert" (OuterVolumeSpecName: "cert") pod "838ccee4-82c8-4ad7-b967-7896a0c601e0" (UID: "838ccee4-82c8-4ad7-b967-7896a0c601e0"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:13:03.461432 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.461411 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/838ccee4-82c8-4ad7-b967-7896a0c601e0-cert\") on node \"ip-10-0-137-205.ec2.internal\" DevicePath \"\""
Apr 21 10:13:03.461535 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.461435 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pq6w\" (UniqueName: \"kubernetes.io/projected/838ccee4-82c8-4ad7-b967-7896a0c601e0-kube-api-access-2pq6w\") on node \"ip-10-0-137-205.ec2.internal\" DevicePath \"\""
Apr 21 10:13:03.493960 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.493937 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84685cd884-d5plh"
Apr 21 10:13:03.607390 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.607363 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-d5plh"]
Apr 21 10:13:03.609705 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:13:03.609680 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d5010e_67e4_4592_93c7_cdf9fb0c4282.slice/crio-85ff3dc53b2ec49d7b5cdbd591e67ba687c35929b6b0f55487e4e044757b35fe WatchSource:0}: Error finding container 85ff3dc53b2ec49d7b5cdbd591e67ba687c35929b6b0f55487e4e044757b35fe: Status 404 returned error can't find the container with id 85ff3dc53b2ec49d7b5cdbd591e67ba687c35929b6b0f55487e4e044757b35fe
Apr 21 10:13:03.854366 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.854301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84685cd884-d5plh" event={"ID":"81d5010e-67e4-4592-93c7-cdf9fb0c4282","Type":"ContainerStarted","Data":"85ff3dc53b2ec49d7b5cdbd591e67ba687c35929b6b0f55487e4e044757b35fe"}
Apr 21 10:13:03.855373 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.855349 2575 generic.go:358] "Generic (PLEG): container finished" podID="838ccee4-82c8-4ad7-b967-7896a0c601e0" containerID="bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014" exitCode=0
Apr 21 10:13:03.855484 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.855398 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84685cd884-qxtfl"
Apr 21 10:13:03.855484 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.855408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84685cd884-qxtfl" event={"ID":"838ccee4-82c8-4ad7-b967-7896a0c601e0","Type":"ContainerDied","Data":"bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014"}
Apr 21 10:13:03.855484 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.855442 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84685cd884-qxtfl" event={"ID":"838ccee4-82c8-4ad7-b967-7896a0c601e0","Type":"ContainerDied","Data":"02aa6c82e82426dcc380542dc798b9d551786d9a3ec0c577e7463bf9d0bd5ce9"}
Apr 21 10:13:03.855484 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.855464 2575 scope.go:117] "RemoveContainer" containerID="bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014"
Apr 21 10:13:03.863474 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.863453 2575 scope.go:117] "RemoveContainer" containerID="bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014"
Apr 21 10:13:03.863731 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:13:03.863711 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014\": container with ID starting with bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014 not found: ID does not exist"
containerID="bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014" Apr 21 10:13:03.863804 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.863744 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014"} err="failed to get container status \"bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014\": rpc error: code = NotFound desc = could not find container \"bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014\": container with ID starting with bb7c01cfeadc7ded4d7a0e8f8144a419393a320d5cfbca70325a68a07845c014 not found: ID does not exist" Apr 21 10:13:03.874719 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.874698 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-qxtfl"] Apr 21 10:13:03.878586 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:03.878563 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84685cd884-qxtfl"] Apr 21 10:13:04.222352 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:04.222319 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="838ccee4-82c8-4ad7-b967-7896a0c601e0" path="/var/lib/kubelet/pods/838ccee4-82c8-4ad7-b967-7896a0c601e0/volumes" Apr 21 10:13:04.860437 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:04.860398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84685cd884-d5plh" event={"ID":"81d5010e-67e4-4592-93c7-cdf9fb0c4282","Type":"ContainerStarted","Data":"91bf39659964e110cdacfcaf33a1607ce86bb939f9960e26e1b94022a9722230"} Apr 21 10:13:04.860627 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:04.860588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84685cd884-d5plh" Apr 21 10:13:04.876544 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:04.876501 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84685cd884-d5plh" podStartSLOduration=1.48274263 podStartE2EDuration="1.876490521s" podCreationTimestamp="2026-04-21 10:13:03 +0000 UTC" firstStartedPulling="2026-04-21 10:13:03.610840025 +0000 UTC m=+557.995678928" lastFinishedPulling="2026-04-21 10:13:04.004587912 +0000 UTC m=+558.389426819" observedRunningTime="2026-04-21 10:13:04.875736186 +0000 UTC m=+559.260575113" watchObservedRunningTime="2026-04-21 10:13:04.876490521 +0000 UTC m=+559.261329495" Apr 21 10:13:35.871703 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:35.871670 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84685cd884-d5plh" Apr 21 10:13:36.672797 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.672769 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-dcbms"] Apr 21 10:13:36.673084 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.673071 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="838ccee4-82c8-4ad7-b967-7896a0c601e0" containerName="manager" Apr 21 10:13:36.673136 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.673086 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="838ccee4-82c8-4ad7-b967-7896a0c601e0" containerName="manager" Apr 21 10:13:36.673170 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.673149 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="838ccee4-82c8-4ad7-b967-7896a0c601e0" containerName="manager" Apr 21 10:13:36.675989 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.675970 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.680195 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.680175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-h7w2x\"" Apr 21 10:13:36.681714 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.681690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 21 10:13:36.690313 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.690293 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dcbms"] Apr 21 10:13:36.696996 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.696978 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-8zfdj"] Apr 21 10:13:36.699903 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.699871 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.703349 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.703332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 21 10:13:36.703506 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.703489 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-7km5n\"" Apr 21 10:13:36.720303 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.720282 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8zfdj"] Apr 21 10:13:36.784474 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.784452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27wn\" (UniqueName: \"kubernetes.io/projected/8615101b-62cb-4ea6-a79b-0c6da78e212e-kube-api-access-k27wn\") pod 
\"model-serving-api-86f7b4b499-dcbms\" (UID: \"8615101b-62cb-4ea6-a79b-0c6da78e212e\") " pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.784591 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.784493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/886ea626-b814-4d74-b575-9a067632751b-cert\") pod \"odh-model-controller-696fc77849-8zfdj\" (UID: \"886ea626-b814-4d74-b575-9a067632751b\") " pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.784591 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.784521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdz4\" (UniqueName: \"kubernetes.io/projected/886ea626-b814-4d74-b575-9a067632751b-kube-api-access-mpdz4\") pod \"odh-model-controller-696fc77849-8zfdj\" (UID: \"886ea626-b814-4d74-b575-9a067632751b\") " pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.784591 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.784561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8615101b-62cb-4ea6-a79b-0c6da78e212e-tls-certs\") pod \"model-serving-api-86f7b4b499-dcbms\" (UID: \"8615101b-62cb-4ea6-a79b-0c6da78e212e\") " pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.885121 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.885095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k27wn\" (UniqueName: \"kubernetes.io/projected/8615101b-62cb-4ea6-a79b-0c6da78e212e-kube-api-access-k27wn\") pod \"model-serving-api-86f7b4b499-dcbms\" (UID: \"8615101b-62cb-4ea6-a79b-0c6da78e212e\") " pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.885479 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.885127 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/886ea626-b814-4d74-b575-9a067632751b-cert\") pod \"odh-model-controller-696fc77849-8zfdj\" (UID: \"886ea626-b814-4d74-b575-9a067632751b\") " pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.885479 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.885149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdz4\" (UniqueName: \"kubernetes.io/projected/886ea626-b814-4d74-b575-9a067632751b-kube-api-access-mpdz4\") pod \"odh-model-controller-696fc77849-8zfdj\" (UID: \"886ea626-b814-4d74-b575-9a067632751b\") " pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.885479 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.885277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8615101b-62cb-4ea6-a79b-0c6da78e212e-tls-certs\") pod \"model-serving-api-86f7b4b499-dcbms\" (UID: \"8615101b-62cb-4ea6-a79b-0c6da78e212e\") " pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.887445 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.887424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/886ea626-b814-4d74-b575-9a067632751b-cert\") pod \"odh-model-controller-696fc77849-8zfdj\" (UID: \"886ea626-b814-4d74-b575-9a067632751b\") " pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.887535 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.887473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8615101b-62cb-4ea6-a79b-0c6da78e212e-tls-certs\") pod \"model-serving-api-86f7b4b499-dcbms\" (UID: \"8615101b-62cb-4ea6-a79b-0c6da78e212e\") " pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.895487 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.895460 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27wn\" (UniqueName: \"kubernetes.io/projected/8615101b-62cb-4ea6-a79b-0c6da78e212e-kube-api-access-k27wn\") pod \"model-serving-api-86f7b4b499-dcbms\" (UID: \"8615101b-62cb-4ea6-a79b-0c6da78e212e\") " pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:36.895676 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.895660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdz4\" (UniqueName: \"kubernetes.io/projected/886ea626-b814-4d74-b575-9a067632751b-kube-api-access-mpdz4\") pod \"odh-model-controller-696fc77849-8zfdj\" (UID: \"886ea626-b814-4d74-b575-9a067632751b\") " pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:36.986020 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:36.985967 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:37.009192 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:37.009166 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:37.113919 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:37.113874 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dcbms"] Apr 21 10:13:37.117361 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:13:37.117331 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8615101b_62cb_4ea6_a79b_0c6da78e212e.slice/crio-c0254f7132c8b4e300b35f8ce42e6cf5f2a0875339f7a11ac86880f9c28c3c26 WatchSource:0}: Error finding container c0254f7132c8b4e300b35f8ce42e6cf5f2a0875339f7a11ac86880f9c28c3c26: Status 404 returned error can't find the container with id c0254f7132c8b4e300b35f8ce42e6cf5f2a0875339f7a11ac86880f9c28c3c26 Apr 21 10:13:37.134248 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:37.134223 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8zfdj"] Apr 21 10:13:37.136613 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:13:37.136586 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886ea626_b814_4d74_b575_9a067632751b.slice/crio-0b197bd942f47e16cd80ef460b068854a99f7c2dc48d6f252fd2380ede2b7929 WatchSource:0}: Error finding container 0b197bd942f47e16cd80ef460b068854a99f7c2dc48d6f252fd2380ede2b7929: Status 404 returned error can't find the container with id 0b197bd942f47e16cd80ef460b068854a99f7c2dc48d6f252fd2380ede2b7929 Apr 21 10:13:37.972305 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:37.972259 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8zfdj" event={"ID":"886ea626-b814-4d74-b575-9a067632751b","Type":"ContainerStarted","Data":"0b197bd942f47e16cd80ef460b068854a99f7c2dc48d6f252fd2380ede2b7929"} Apr 21 10:13:37.973495 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:37.973458 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dcbms" event={"ID":"8615101b-62cb-4ea6-a79b-0c6da78e212e","Type":"ContainerStarted","Data":"c0254f7132c8b4e300b35f8ce42e6cf5f2a0875339f7a11ac86880f9c28c3c26"} Apr 21 10:13:40.985974 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:40.985939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8zfdj" event={"ID":"886ea626-b814-4d74-b575-9a067632751b","Type":"ContainerStarted","Data":"801269ff4fdeb83b2860f8fa9de7485db290ecf37ce60821fde5894ac2e1544d"} Apr 21 10:13:40.986313 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:40.986172 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:40.987305 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:40.987284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dcbms" event={"ID":"8615101b-62cb-4ea6-a79b-0c6da78e212e","Type":"ContainerStarted","Data":"5384b9c391794d664ad8bf8bef8e5b0eaab23c8341cdfc2ef6f29d84b63c49b8"} Apr 21 10:13:40.987423 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:40.987406 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:13:41.005007 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:41.004966 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-8zfdj" podStartSLOduration=1.321763051 podStartE2EDuration="5.004954931s" podCreationTimestamp="2026-04-21 10:13:36 +0000 UTC" firstStartedPulling="2026-04-21 10:13:37.137758108 +0000 UTC m=+591.522597012" lastFinishedPulling="2026-04-21 10:13:40.820949976 +0000 UTC m=+595.205788892" observedRunningTime="2026-04-21 10:13:41.004113839 +0000 UTC m=+595.388952764" watchObservedRunningTime="2026-04-21 10:13:41.004954931 +0000 UTC m=+595.389793858" Apr 21 
10:13:41.021868 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:41.021816 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-dcbms" podStartSLOduration=1.26906263 podStartE2EDuration="5.021806926s" podCreationTimestamp="2026-04-21 10:13:36 +0000 UTC" firstStartedPulling="2026-04-21 10:13:37.119473953 +0000 UTC m=+591.504312857" lastFinishedPulling="2026-04-21 10:13:40.872218246 +0000 UTC m=+595.257057153" observedRunningTime="2026-04-21 10:13:41.020126754 +0000 UTC m=+595.404965680" watchObservedRunningTime="2026-04-21 10:13:41.021806926 +0000 UTC m=+595.406645852" Apr 21 10:13:46.139614 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:46.139578 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:13:46.140105 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:46.139820 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:13:51.993699 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:51.993665 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-8zfdj" Apr 21 10:13:51.995701 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:13:51.995679 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-dcbms" Apr 21 10:14:48.759192 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.759158 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-75rh9"] Apr 21 10:14:48.762320 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.762305 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.764911 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.764869 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 21 10:14:48.765038 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.765014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 21 10:14:48.765118 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.765099 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mv2mf\"" Apr 21 10:14:48.768843 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.768822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-75rh9"] Apr 21 10:14:48.786556 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.786527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0f471317-2411-469f-ba6c-89a68eba258f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.786642 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.786570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.786696 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.786672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnq6m\" (UniqueName: 
\"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-kube-api-access-lnq6m\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.887893 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.887857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnq6m\" (UniqueName: \"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-kube-api-access-lnq6m\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.888005 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.887928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0f471317-2411-469f-ba6c-89a68eba258f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.888005 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.887957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.888112 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:14:48.888054 2575 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 21 10:14:48.888112 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:14:48.888065 2575 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-75rh9: secret "seaweedfs-tls-serving" not found Apr 21 
10:14:48.888112 ip-10-0-137-205 kubenswrapper[2575]: E0421 10:14:48.888111 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-seaweedfs-tls-serving podName:0f471317-2411-469f-ba6c-89a68eba258f nodeName:}" failed. No retries permitted until 2026-04-21 10:14:49.388096237 +0000 UTC m=+663.772935142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-75rh9" (UID: "0f471317-2411-469f-ba6c-89a68eba258f") : secret "seaweedfs-tls-serving" not found Apr 21 10:14:48.888296 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.888278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0f471317-2411-469f-ba6c-89a68eba258f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:48.896575 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:48.896550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnq6m\" (UniqueName: \"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-kube-api-access-lnq6m\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:49.391295 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:49.391264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:49.393658 
ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:49.393627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0f471317-2411-469f-ba6c-89a68eba258f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-75rh9\" (UID: \"0f471317-2411-469f-ba6c-89a68eba258f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:49.672716 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:49.672692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" Apr 21 10:14:49.792961 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:49.792934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-75rh9"] Apr 21 10:14:49.795829 ip-10-0-137-205 kubenswrapper[2575]: W0421 10:14:49.795801 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f471317_2411_469f_ba6c_89a68eba258f.slice/crio-81bb01cf5c5311a3a3b0282339c6e0f99c16fd09a2b12442244bf85c2cd43c5d WatchSource:0}: Error finding container 81bb01cf5c5311a3a3b0282339c6e0f99c16fd09a2b12442244bf85c2cd43c5d: Status 404 returned error can't find the container with id 81bb01cf5c5311a3a3b0282339c6e0f99c16fd09a2b12442244bf85c2cd43c5d Apr 21 10:14:50.225777 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:50.225740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" event={"ID":"0f471317-2411-469f-ba6c-89a68eba258f","Type":"ContainerStarted","Data":"81bb01cf5c5311a3a3b0282339c6e0f99c16fd09a2b12442244bf85c2cd43c5d"} Apr 21 10:14:53.232433 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:53.232397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" 
event={"ID":"0f471317-2411-469f-ba6c-89a68eba258f","Type":"ContainerStarted","Data":"fa4fe33137e9695af33846cac358c479dd20af7446fad829c9e7ac9bc23a4fe9"} Apr 21 10:14:53.247983 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:14:53.247940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-75rh9" podStartSLOduration=2.7626552970000002 podStartE2EDuration="5.247927156s" podCreationTimestamp="2026-04-21 10:14:48 +0000 UTC" firstStartedPulling="2026-04-21 10:14:49.797276589 +0000 UTC m=+664.182115508" lastFinishedPulling="2026-04-21 10:14:52.282548462 +0000 UTC m=+666.667387367" observedRunningTime="2026-04-21 10:14:53.246627174 +0000 UTC m=+667.631466103" watchObservedRunningTime="2026-04-21 10:14:53.247927156 +0000 UTC m=+667.632766082" Apr 21 10:18:46.163138 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:18:46.163061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:18:46.164786 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:18:46.164763 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:23:46.184958 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:23:46.184922 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:23:46.189042 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:23:46.189018 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:28:46.205576 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:28:46.205547 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:28:46.210404 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:28:46.210375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:33:46.226978 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:33:46.226945 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:33:46.232672 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:33:46.232652 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:38:46.252654 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:38:46.252628 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:38:46.259122 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:38:46.259102 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:43:46.272997 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:43:46.272964 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:43:46.280561 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:43:46.280538 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:48:46.293673 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:48:46.293600 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:48:46.301130 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:48:46.301113 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:53:46.314843 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:53:46.314737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:53:46.322924 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:53:46.322900 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:58:46.340216 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:58:46.340106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 10:58:46.345938 ip-10-0-137-205 kubenswrapper[2575]: I0421 10:58:46.345917 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 11:03:46.361089 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:03:46.360964 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 11:03:46.367034 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:03:46.367016 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 11:08:46.381232 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:08:46.381127 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 11:08:46.386939 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:08:46.386918 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 11:10:23.142961 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:23.142931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m5ktd_43b65a7c-65a4-4a35-8d96-82a1e1a9288d/global-pull-secret-syncer/0.log" Apr 21 11:10:23.349961 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:23.349930 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qlmqr_629199d2-db97-402a-9f00-3541bd582211/konnectivity-agent/0.log" Apr 21 11:10:23.391244 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:23.391217 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-205.ec2.internal_2aa63aeeee1f49e020d8254357661d3b/haproxy/0.log" Apr 21 11:10:27.090614 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.090584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5d7nd_dc8f7796-af2d-4cc6-8807-bb8dcf779711/cluster-monitoring-operator/0.log" Apr 21 11:10:27.251474 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.251443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qg446_a12c9b59-e4c0-4ecf-b7ef-27e880596b38/node-exporter/0.log" Apr 21 11:10:27.273957 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.273932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qg446_a12c9b59-e4c0-4ecf-b7ef-27e880596b38/kube-rbac-proxy/0.log" Apr 21 11:10:27.298607 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.298582 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qg446_a12c9b59-e4c0-4ecf-b7ef-27e880596b38/init-textfile/0.log" Apr 21 11:10:27.471233 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.471199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zljwr_2fad54c4-d0f0-4d68-8efb-bf41e5c630ec/kube-rbac-proxy-main/0.log" Apr 21 11:10:27.493202 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.493178 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zljwr_2fad54c4-d0f0-4d68-8efb-bf41e5c630ec/kube-rbac-proxy-self/0.log" Apr 21 11:10:27.522593 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.522571 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zljwr_2fad54c4-d0f0-4d68-8efb-bf41e5c630ec/openshift-state-metrics/0.log" Apr 21 11:10:27.744915 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.744813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bc76w_b5ad8623-c5f0-42d4-bcc4-13ca87efed63/prometheus-operator/0.log" Apr 21 11:10:27.763425 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:27.763403 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bc76w_b5ad8623-c5f0-42d4-bcc4-13ca87efed63/kube-rbac-proxy/0.log" Apr 21 11:10:29.803068 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.803037 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl"] Apr 21 11:10:29.806263 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.806243 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.808730 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.808706 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x9mhh\"/\"openshift-service-ca.crt\"" Apr 21 11:10:29.808827 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.808762 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x9mhh\"/\"kube-root-ca.crt\"" Apr 21 11:10:29.809653 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.809637 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x9mhh\"/\"default-dockercfg-bdbtk\"" Apr 21 11:10:29.815096 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.815076 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl"] Apr 21 11:10:29.884185 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.884159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw8jd\" (UniqueName: \"kubernetes.io/projected/104a80c1-a882-4592-bb3a-4bca10f1f967-kube-api-access-cw8jd\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.884280 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.884194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-sys\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.884280 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.884212 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-lib-modules\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.884280 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.884270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-proc\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.884394 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.884302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-podres\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985255 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985231 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-sys\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985340 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-lib-modules\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " 
pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985340 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-proc\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985340 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-podres\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985472 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw8jd\" (UniqueName: \"kubernetes.io/projected/104a80c1-a882-4592-bb3a-4bca10f1f967-kube-api-access-cw8jd\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985472 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-proc\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985472 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-sys\") 
pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985472 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-lib-modules\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.985472 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.985420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/104a80c1-a882-4592-bb3a-4bca10f1f967-podres\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:29.993140 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:29.993114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw8jd\" (UniqueName: \"kubernetes.io/projected/104a80c1-a882-4592-bb3a-4bca10f1f967-kube-api-access-cw8jd\") pod \"perf-node-gather-daemonset-kstgl\" (UID: \"104a80c1-a882-4592-bb3a-4bca10f1f967\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:30.116055 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:30.115982 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:30.437994 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:30.437968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl"] Apr 21 11:10:30.440448 ip-10-0-137-205 kubenswrapper[2575]: W0421 11:10:30.440418 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod104a80c1_a882_4592_bb3a_4bca10f1f967.slice/crio-b32531bff306b5ecff6381029a0ba45c57c289762dbfcb17d66655cf74067066 WatchSource:0}: Error finding container b32531bff306b5ecff6381029a0ba45c57c289762dbfcb17d66655cf74067066: Status 404 returned error can't find the container with id b32531bff306b5ecff6381029a0ba45c57c289762dbfcb17d66655cf74067066 Apr 21 11:10:30.442069 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:30.442052 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 11:10:30.971932 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:30.971891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" event={"ID":"104a80c1-a882-4592-bb3a-4bca10f1f967","Type":"ContainerStarted","Data":"3d83307367b65c9af4c0ae48153664f4230436df86d6af0489260ee47e2b4e6b"} Apr 21 11:10:30.971932 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:30.971935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" event={"ID":"104a80c1-a882-4592-bb3a-4bca10f1f967","Type":"ContainerStarted","Data":"b32531bff306b5ecff6381029a0ba45c57c289762dbfcb17d66655cf74067066"} Apr 21 11:10:30.972343 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:30.972004 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:30.992757 ip-10-0-137-205 
kubenswrapper[2575]: I0421 11:10:30.992703 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" podStartSLOduration=1.9926879309999999 podStartE2EDuration="1.992687931s" podCreationTimestamp="2026-04-21 11:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:10:30.992566077 +0000 UTC m=+4005.377405003" watchObservedRunningTime="2026-04-21 11:10:30.992687931 +0000 UTC m=+4005.377526839" Apr 21 11:10:31.097264 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:31.097236 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pwrsm_4004e34e-ae2c-4815-bea5-806b5e15036f/dns/0.log" Apr 21 11:10:31.120979 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:31.120951 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pwrsm_4004e34e-ae2c-4815-bea5-806b5e15036f/kube-rbac-proxy/0.log" Apr 21 11:10:31.177600 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:31.177571 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9nzkd_15e1aa47-50b2-4765-a870-9c646ae4fb01/dns-node-resolver/0.log" Apr 21 11:10:31.588429 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:31.588394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-779dbdc744-bx9cs_c491f05b-ff00-4798-a36d-4e55bd12f403/registry/0.log" Apr 21 11:10:31.634543 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:31.634514 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7p7gp_bf91954e-2bd6-4597-b660-140925e88c87/node-ca/0.log" Apr 21 11:10:32.330748 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:32.330716 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-5578cc5b5b-gktb6_d9d0fea7-43c5-4e4f-a753-45edf112980c/router/0.log" Apr 21 11:10:32.663242 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:32.663176 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-968lb_1532c4fc-8a96-448b-954a-32eca5cac710/serve-healthcheck-canary/0.log" Apr 21 11:10:33.025714 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:33.025690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gzcbd_95ed4025-e077-4170-b387-681d37c22925/insights-operator/0.log" Apr 21 11:10:33.027018 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:33.026999 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gzcbd_95ed4025-e077-4170-b387-681d37c22925/insights-operator/1.log" Apr 21 11:10:33.047614 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:33.047578 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6pxnr_2fb17919-bd4a-48df-a399-d9c623b57ea9/kube-rbac-proxy/0.log" Apr 21 11:10:33.069274 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:33.069244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6pxnr_2fb17919-bd4a-48df-a399-d9c623b57ea9/exporter/0.log" Apr 21 11:10:33.091051 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:33.091026 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6pxnr_2fb17919-bd4a-48df-a399-d9c623b57ea9/extractor/0.log" Apr 21 11:10:35.247154 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:35.247122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84685cd884-d5plh_81d5010e-67e4-4592-93c7-cdf9fb0c4282/manager/0.log" Apr 21 11:10:35.360339 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:35.360313 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-dcbms_8615101b-62cb-4ea6-a79b-0c6da78e212e/server/0.log" Apr 21 11:10:35.572503 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:35.572425 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-8zfdj_886ea626-b814-4d74-b575-9a067632751b/manager/0.log" Apr 21 11:10:35.724503 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:35.724477 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-75rh9_0f471317-2411-469f-ba6c-89a68eba258f/seaweedfs-tls-serving/0.log" Apr 21 11:10:36.985357 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:36.985331 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-kstgl" Apr 21 11:10:41.358387 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.358317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/kube-multus-additional-cni-plugins/0.log" Apr 21 11:10:41.385417 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.385394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/egress-router-binary-copy/0.log" Apr 21 11:10:41.410968 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.410946 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/cni-plugins/0.log" Apr 21 11:10:41.434211 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.434193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/bond-cni-plugin/0.log" Apr 21 11:10:41.462433 ip-10-0-137-205 
kubenswrapper[2575]: I0421 11:10:41.462411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/routeoverride-cni/0.log" Apr 21 11:10:41.484301 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.484280 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/whereabouts-cni-bincopy/0.log" Apr 21 11:10:41.509706 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.509679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xmljw_d4d5b62e-5966-4d1f-b4ca-61a8acf47843/whereabouts-cni/0.log" Apr 21 11:10:41.539412 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.539388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbjqk_eac76dbb-f531-4a5b-a588-7457fe7db6c4/kube-multus/0.log" Apr 21 11:10:41.710370 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.710339 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z7xfz_1100ece6-afda-453b-8595-490c94dbb90d/network-metrics-daemon/0.log" Apr 21 11:10:41.733655 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:41.733626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z7xfz_1100ece6-afda-453b-8595-490c94dbb90d/kube-rbac-proxy/0.log" Apr 21 11:10:43.355556 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.355524 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-controller/0.log" Apr 21 11:10:43.374538 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.374515 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/0.log" Apr 21 
11:10:43.391664 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.391642 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovn-acl-logging/1.log" Apr 21 11:10:43.411164 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.411138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/kube-rbac-proxy-node/0.log" Apr 21 11:10:43.433384 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.433363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 11:10:43.451985 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.451962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/northd/0.log" Apr 21 11:10:43.477749 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.477728 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/nbdb/0.log" Apr 21 11:10:43.499891 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.499855 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/sbdb/0.log" Apr 21 11:10:43.598191 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:43.598164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w88kn_78e28615-5373-42a5-a30d-cd814a0943b4/ovnkube-controller/0.log" Apr 21 11:10:44.570420 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:44.570390 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-gpv7b_08e569b3-eeff-4b72-b452-72b801cbdb72/check-endpoints/0.log" Apr 21 11:10:44.597091 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:44.597067 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4kjcv_983ab4c3-e7a5-4914-b247-d139ed1699ad/network-check-target-container/0.log" Apr 21 11:10:45.601112 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:45.601066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xdnxh_99b900e4-2c7f-4af0-bb0e-3e9eef7571a5/iptables-alerter/0.log" Apr 21 11:10:46.269923 ip-10-0-137-205 kubenswrapper[2575]: I0421 11:10:46.269894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zpdx2_cabbe82c-cbeb-4577-9474-2dce5128f826/tuned/0.log"