Apr 22 18:17:59.714429 ip-10-0-136-5 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:17:59.714442 ip-10-0-136-5 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:17:59.714452 ip-10-0-136-5 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:17:59.714753 ip-10-0-136-5 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:18:09.789242 ip-10-0-136-5 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:18:09.789262 ip-10-0-136-5 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6d426c2a3f2c4b1f9fe5b76e4912940a --
Apr 22 18:20:50.718825 ip-10-0-136-5 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:20:51.114790 ip-10-0-136-5 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:51.114790 ip-10-0-136-5 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:20:51.114790 ip-10-0-136-5 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:51.114790 ip-10-0-136-5 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:20:51.114790 ip-10-0-136-5 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:51.116369 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.116284 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:20:51.118746 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118732 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:51.118746 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118746 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118750 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118753 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118756 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118760 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118762 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118765 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118768 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118771 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118774 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118777 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118779 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118782 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118785 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118787 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118790 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118793 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118796 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118799 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118801 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:51.118808 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118804 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118806 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118809 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118812 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118815 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118818 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118821 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118824 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118827 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118829 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118832 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118836 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118839 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118842 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118845 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118847 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118850 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118852 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118855 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:51.119278 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118857 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118860 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118862 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118865 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118867 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118869 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118873 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118877 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118880 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118882 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118885 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118887 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118890 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118893 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118895 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118898 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118901 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118903 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118906 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118909 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:51.119762 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118911 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118916 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118919 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118922 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118925 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118927 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118930 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118933 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118935 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118938 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118941 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118943 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118945 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118949 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118951 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118954 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118957 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118959 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118962 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:51.120235 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118964 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118967 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118970 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118972 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118975 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118977 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.118980 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119355 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119361 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119364 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119367 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119370 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119373 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119375 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119378 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119381 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119384 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119387 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119389 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:51.120714 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119392 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119396 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119400 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119403 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119406 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119409 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119412 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119414 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119417 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119419 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119422 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119424 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119427 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119429 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119432 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119435 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119437 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119440 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119442 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119445 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:51.121152 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119447 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119450 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119453 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119456 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119458 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119462 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119465 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119468 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119470 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119473 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119475 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119478 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119480 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119483 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119485 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119487 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119490 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119492 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119495 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119497 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:51.121737 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119500 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119502 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119505 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119507 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119511 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119513 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119516 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119518 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119521 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119523 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119526 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119528 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119531 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119533 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119536 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119539 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119542 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119547 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119549 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:51.122216 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119552 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119555 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119557 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119575 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119577 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119580 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119582 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119585 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119588 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119590 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119593 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119596 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119598 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119601 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.119603 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120180 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120188 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120193 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120197 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120206 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120209 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:20:51.122693 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120214 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120218 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120221 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120224 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120228 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120231 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120234 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120237 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120240 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120244 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120247 2580 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120250 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120253 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120258 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120261 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120264 2580 flags.go:64] FLAG: --config-dir=""
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120267 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120271 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120274 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120277 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120281 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120284 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120287 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120290 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:20:51.123223 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120293 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120296 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120298 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120303 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120305 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120308 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120311 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120314 2580 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120317 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120321 2580 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120324 2580 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120327 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120330 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120333 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120338 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120340 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120343 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120346 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120349 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120352 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120355 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120358 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120361 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120364 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120367
2580 flags.go:64] FLAG: --feature-gates="" Apr 22 18:20:51.123823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120370 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120373 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120376 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120383 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120386 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120389 2580 flags.go:64] FLAG: --help="false" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120392 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120395 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120397 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120400 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120403 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120407 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120410 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 
18:20:51.120412 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120415 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120418 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120421 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120424 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120427 2580 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120430 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120433 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120436 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120439 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120441 2580 flags.go:64] FLAG: --lock-file="" Apr 22 18:20:51.124460 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120444 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120447 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120450 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120456 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:20:51.125069 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:20:51.120458 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120461 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120465 2580 flags.go:64] FLAG: --logging-format="text" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120467 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120471 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120473 2580 flags.go:64] FLAG: --manifest-url="" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120476 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120482 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120485 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120488 2580 flags.go:64] FLAG: --max-pods="110" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120491 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120494 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120497 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120500 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120503 2580 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120506 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120509 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120515 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120518 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120521 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:20:51.125069 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120525 2580 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120527 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120533 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120536 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120539 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120542 2580 flags.go:64] FLAG: --port="10250" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120545 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120548 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02ab94a0b381a4523" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120551 2580 
flags.go:64] FLAG: --qos-reserved="" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120554 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120557 2580 flags.go:64] FLAG: --register-node="true" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120575 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120578 2580 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120582 2580 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120585 2580 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120588 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120591 2580 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120595 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120598 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120601 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120604 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120608 2580 flags.go:64] FLAG: --runonce="false" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120610 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120613 2580 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120616 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:20:51.125652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120619 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120622 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120625 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120628 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120631 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120634 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120636 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120641 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120644 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120647 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120651 2580 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120653 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120659 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:20:51.126279 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:20:51.120661 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120664 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120669 2580 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120671 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120674 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120677 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120680 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120683 2580 flags.go:64] FLAG: --v="2" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120687 2580 flags.go:64] FLAG: --version="false" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120691 2580 flags.go:64] FLAG: --vmodule="" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120696 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.120699 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:20:51.126279 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120789 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120793 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120798 2580 feature_gate.go:328] unrecognized 
feature gate: MixedCPUsAllocation Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120801 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120803 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120806 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120808 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120811 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120814 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120816 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120819 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120821 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120823 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120826 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120830 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120833 2580 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120836 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120839 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120841 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120844 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:51.126912 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120846 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120849 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120851 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120854 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120856 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120859 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120862 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120864 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120867 2580 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120870 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120873 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120875 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120878 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120880 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120884 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120887 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120890 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120892 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120895 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120898 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:51.127402 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120900 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120903 
2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120905 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120908 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120910 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120913 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120919 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120922 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120925 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120927 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120930 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120933 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120935 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120937 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:20:51.127925 ip-10-0-136-5 
kubenswrapper[2580]: W0422 18:20:51.120940 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120942 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120945 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120948 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120950 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120953 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:51.127925 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120955 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120957 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120960 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120963 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120966 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120968 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120972 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 
18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120976 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120980 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120983 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120985 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120988 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120991 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120994 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120997 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.120999 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121002 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121005 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:20:51.128756 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121008 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:51.128756 ip-10-0-136-5 
kubenswrapper[2580]: W0422 18:20:51.121011 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121013 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121016 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121018 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121021 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121024 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.121027 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.121589 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.127816 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.127887 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128473 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128481 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128484 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128488 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128491 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:51.129398 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128494 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128498 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128510 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128513 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128516 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128519 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128523 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128526 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128529 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128534 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128539 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128544 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128549 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128552 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128558 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128578 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128581 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128583 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128586 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128589 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:51.129836 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128592 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128594 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128597 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128600 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128603 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128606 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128612 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128625 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128631 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128634 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128637 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128640 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128643 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128646 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128649 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128652 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128655 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128658 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128660 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128663 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:51.130319 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128669 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128672 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128675 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128680 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128685 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128689 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128693 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128698 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128702 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128707 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128711 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128716 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128720 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128729 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128734 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128739 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128743 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128747 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128752 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128756 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:51.130824 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128760 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128763 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128766 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128770 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128775 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128787 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128792 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128796 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128800 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128804 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128809 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128814 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128819 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128824 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128828 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128833 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128838 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128847 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128852 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:51.131299 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128856 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.128861 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.128870 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129814 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129836 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129841 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129845 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129849 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129853 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129856 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129860 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129863 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129865 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129868 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129872 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:51.131820 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129875 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129878 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129882 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129884 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129887 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129890 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129893 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129896 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129899 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129901 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129904 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129909 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129914 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129918 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129922 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129926 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129929 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129932 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129935 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:51.132198 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129938 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129941 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129944 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129947 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129950 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129953 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129955 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129958 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129961 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129963 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129966 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129968 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129972 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129974 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129977 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129980 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129982 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129985 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129988 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129990 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:51.132670 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129993 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129995 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.129998 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130001 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130004 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130006 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130009 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130012 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130014 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130017 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130019 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130022 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130025 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130027 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130030 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130033 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130036 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130039 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130041 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130044 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:51.133169 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130046 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130049 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130051 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130054 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130056 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130059 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130062 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130064 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130067 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130069 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130072 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130074 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130077 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130080 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:51.130082 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.130088 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:51.133658 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.130711 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:20:51.134046 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.132808 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:20:51.134079 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.134046 2580 server.go:1019] "Starting client certificate rotation"
Apr 22 18:20:51.134156 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.134138 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:20:51.134187 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.134180 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:20:51.155908 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.155890 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:20:51.159795 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.159780 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:20:51.175995 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.175971 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:20:51.180987 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.180972 2580 log.go:25] "Validated CRI v1 image API"
Apr 22 18:20:51.182116 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.182100 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:20:51.188459 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.188436 2580 fs.go:135] Filesystem UUIDs: map[1580fb9a-b887-4342-b1a8-e8d1b5b3a136:/dev/nvme0n1p4 26c60c45-75ff-4b80-abb0-cbbc003a4833:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 18:20:51.188541 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.188460 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:20:51.188541 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.188513 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:20:51.193245 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193108 2580 manager.go:217] Machine: {Timestamp:2026-04-22 18:20:51.192090223 +0000 UTC m=+0.370393633 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3081503 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f2935e376cbfae1bf19d0ffd3f6b1 SystemUUID:ec2f2935-e376-cbfa-e1bf-19d0ffd3f6b1 BootID:6d426c2a-3f2c-4b1f-9fe5-b76e4912940a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:be:71:b5:8a:f3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:be:71:b5:8a:f3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:5b:72:73:82:af Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:20:51.193245 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193234 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:20:51.193408 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193332 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:20:51.193710 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193687 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:20:51.193877 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193710 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-5.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"contain
er","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:20:51.193964 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193890 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:20:51.193964 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193903 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:20:51.193964 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.193921 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:20:51.194534 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.194521 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:20:51.195266 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.195254 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:51.195388 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.195377 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:20:51.197362 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.197351 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:20:51.197442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.197369 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:20:51.197442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.197386 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:20:51.197442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.197400 2580 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:20:51.197442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.197412 2580 apiserver.go:42] "Waiting for node sync before watching apiserver 
pods" Apr 22 18:20:51.201000 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.200902 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:20:51.201000 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.200945 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:20:51.205318 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.205299 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:20:51.206441 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.206426 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:20:51.208015 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.207999 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:20:51.208015 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208016 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208022 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208028 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208034 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208039 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208050 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:20:51.208092 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:20:51.208055 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208062 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208068 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208077 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:20:51.208092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208085 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:20:51.208827 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208818 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:20:51.208827 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.208827 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:20:51.212224 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.212206 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:20:51.212300 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.212249 2580 server.go:1295] "Started kubelet" Apr 22 18:20:51.212964 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.212911 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:20:51.213045 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.212982 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:20:51.213112 ip-10-0-136-5 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:20:51.213321 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.213292 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:20:51.213479 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.213465 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-5.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:20:51.213823 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.213547 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:20:51.213823 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.213681 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:20:51.214595 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.214555 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:20:51.214736 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.214721 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vdd56" Apr 22 18:20:51.216046 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.216033 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:20:51.218307 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.218289 2580 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kubelet-serving" Apr 22 18:20:51.218746 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.218727 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:20:51.219478 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.219459 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:20:51.219478 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.219473 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:20:51.219658 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.219539 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:20:51.219658 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.219601 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:20:51.219658 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.219606 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:20:51.219804 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.219761 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 22 18:20:51.220654 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.220636 2580 factory.go:55] Registering systemd factory Apr 22 18:20:51.220723 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.220699 2580 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:20:51.221340 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.221321 2580 factory.go:153] Registering CRI-O factory Apr 22 18:20:51.221340 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.221338 2580 factory.go:223] Registration of the crio container factory successfully Apr 22 18:20:51.221475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.221396 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: 
no such file or directory Apr 22 18:20:51.221967 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.221957 2580 factory.go:103] Registering Raw factory Apr 22 18:20:51.222018 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.221974 2580 manager.go:1196] Started watching for new ooms in manager Apr 22 18:20:51.222442 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.222406 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:20:51.222524 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.222477 2580 manager.go:319] Starting recovery of all containers Apr 22 18:20:51.222524 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.222500 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:20:51.222709 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.222689 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:20:51.223431 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.222622 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-5.ec2.internal.18a8c0cfae459999 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-5.ec2.internal,UID:ip-10-0-136-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-5.ec2.internal,},FirstTimestamp:2026-04-22 18:20:51.212220825 +0000 UTC m=+0.390524234,LastTimestamp:2026-04-22 18:20:51.212220825 +0000 UTC m=+0.390524234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-5.ec2.internal,}" Apr 22 18:20:51.223775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.223757 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vdd56" Apr 22 18:20:51.233511 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.233360 2580 manager.go:324] Recovery completed Apr 22 18:20:51.234635 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.234596 2580 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:20:51.237514 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.237502 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:51.241789 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.241772 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:51.241860 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.241805 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:51.241860 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.241820 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" 
event="NodeHasSufficientPID" Apr 22 18:20:51.242317 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.242304 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:20:51.242317 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.242316 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:20:51.242388 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.242330 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:51.244175 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.244164 2580 policy_none.go:49] "None policy: Start" Apr 22 18:20:51.244222 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.244180 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:20:51.244222 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.244206 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:20:51.245861 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.244490 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-5.ec2.internal.18a8c0cfb008c53e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-5.ec2.internal,UID:ip-10-0-136-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-5.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-5.ec2.internal,},FirstTimestamp:2026-04-22 18:20:51.241788734 +0000 UTC m=+0.420092151,LastTimestamp:2026-04-22 18:20:51.241788734 +0000 UTC m=+0.420092151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-5.ec2.internal,}" Apr 22 18:20:51.285166 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285152 
2580 manager.go:341] "Starting Device Plugin manager" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.285182 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285195 2580 server.go:85] "Starting device plugin registration server" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285413 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285424 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285520 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285647 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.285656 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.286082 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:20:51.292440 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.286109 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-5.ec2.internal\" not found" Apr 22 18:20:51.354163 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.354121 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 22 18:20:51.355483 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.355457 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:20:51.355606 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.355495 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:20:51.355606 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.355517 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:20:51.355606 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.355525 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:20:51.355606 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.355577 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:20:51.358125 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.358106 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:51.385736 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.385684 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:51.386708 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.386693 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:51.386791 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.386722 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:51.386791 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.386734 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:51.386791 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.386759 2580 kubelet_node_status.go:78] 
"Attempting to register node" node="ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.399902 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.399882 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.399988 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.399905 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-5.ec2.internal\": node \"ip-10-0-136-5.ec2.internal\" not found" Apr 22 18:20:51.422799 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.422783 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 22 18:20:51.456317 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.456296 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"] Apr 22 18:20:51.456369 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.456356 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:51.457107 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.457093 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:51.457168 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.457121 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:51.457168 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.457136 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:51.458151 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458138 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 
18:20:51.458256 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458243 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.458302 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458269 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:51.458774 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458762 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:51.458774 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458773 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:51.458879 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458781 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:51.458879 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458790 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:51.458879 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458789 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:51.458974 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.458885 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:51.459721 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.459708 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.459778 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.459732 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:51.460297 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.460282 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:51.460391 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.460304 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:51.460391 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.460315 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:51.484221 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.484201 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-5.ec2.internal\" not found" node="ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.488447 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.488432 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-5.ec2.internal\" not found" node="ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.521880 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.521860 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c759d37c693dd42a96a635e33d6e4429-config\") pod \"kube-apiserver-proxy-ip-10-0-136-5.ec2.internal\" (UID: \"c759d37c693dd42a96a635e33d6e4429\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.521950 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.521885 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.521950 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.521905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.522890 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.522876 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 22 18:20:51.622188 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.622163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c759d37c693dd42a96a635e33d6e4429-config\") pod \"kube-apiserver-proxy-ip-10-0-136-5.ec2.internal\" (UID: \"c759d37c693dd42a96a635e33d6e4429\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.622303 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.622193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 22 18:20:51.622303 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:20:51.622219 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:51.622303 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.622242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c759d37c693dd42a96a635e33d6e4429-config\") pod \"kube-apiserver-proxy-ip-10-0-136-5.ec2.internal\" (UID: \"c759d37c693dd42a96a635e33d6e4429\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:51.622303 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.622249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:51.622303 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.622255 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:51.623193 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.623180 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:51.723870 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.723796 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:51.786050 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.786009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:51.790622 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:51.790605 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:51.824269 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.824238 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:51.924747 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:51.924714 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.025223 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.025143 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.125646 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.125619 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.133778 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.133759 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:20:52.133920 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.133906 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:20:52.218736 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.218714 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:20:52.225971 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.225951 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.226157 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.226120 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:15:51 +0000 UTC" deadline="2027-09-21 10:21:21.570499965 +0000 UTC"
Apr 22 18:20:52.226206 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.226160 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12400h0m29.344343688s"
Apr 22 18:20:52.236330 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.236304 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:20:52.303554 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.303480 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gvwxm"
Apr 22 18:20:52.315976 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.315955 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gvwxm"
Apr 22 18:20:52.326145 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.326124 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.426407 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.426384 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.483072 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.483048 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:52.513539 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.513514 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:52.526768 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.526749 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.532899 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:52.532875 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c41a124527f124f4d00e3a7bbddbf2.slice/crio-b347803ab93380462de912d53585667babc61d7b70272b594e527e33abfa9fac WatchSource:0}: Error finding container b347803ab93380462de912d53585667babc61d7b70272b594e527e33abfa9fac: Status 404 returned error can't find the container with id b347803ab93380462de912d53585667babc61d7b70272b594e527e33abfa9fac
Apr 22 18:20:52.536656 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.536643 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:20:52.627159 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:52.627084 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found"
Apr 22 18:20:52.675953 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.675932 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:52.719970 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.719950 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:52.735821 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.735800 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:20:52.736599 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.736588 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"
Apr 22 18:20:52.748196 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:52.748179 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:20:52.809481 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:52.809457 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc759d37c693dd42a96a635e33d6e4429.slice/crio-274bd94a86b2bfc23c1da4d97f06c013b40a88c7d8243d91bcff96313272ebf3 WatchSource:0}: Error finding container 274bd94a86b2bfc23c1da4d97f06c013b40a88c7d8243d91bcff96313272ebf3: Status 404 returned error can't find the container with id 274bd94a86b2bfc23c1da4d97f06c013b40a88c7d8243d91bcff96313272ebf3
Apr 22 18:20:53.198490 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.198448 2580 apiserver.go:52] "Watching apiserver"
Apr 22 18:20:53.204380 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.204356 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:20:53.204830 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.204790 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9qx96","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal","openshift-multus/multus-additional-cni-plugins-trrn6","openshift-multus/multus-lffvw","openshift-ovn-kubernetes/ovnkube-node-55s7b","kube-system/konnectivity-agent-6vsjn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg","openshift-image-registry/node-ca-pvr6x","openshift-multus/network-metrics-daemon-qnx5b","openshift-network-diagnostics/network-check-target-5pf5l","openshift-network-operator/iptables-alerter-wxldw","kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal","openshift-cluster-node-tuning-operator/tuned-s9shc"]
Apr 22 18:20:53.206913 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.206881 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.207928 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.207895 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.210111 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.210089 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.210574 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.210536 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:20:53.210691 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.210676 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.210967 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.210951 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.211054 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.210979 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.211150 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.211135 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h9dx4\""
Apr 22 18:20:53.211641 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.211230 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.212798 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.212778 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6vsjn"
Apr 22 18:20:53.213370 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.213352 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:20:53.213512 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.213498 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:20:53.214030 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.214010 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:20:53.214876 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.214857 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:53.214973 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.214917 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:20:53.215118 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.215101 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:20:53.215202 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.215137 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.215382 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.215365 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d6wz6\""
Apr 22 18:20:53.217890 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.217869 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:53.217970 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.217929 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.218524 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.218866 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.219100 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pvr6x"
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.219138 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-48pbs\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.219791 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.219941 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wktdg\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.220031 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.220272 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.220323 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.220652 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.221422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.221070 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:20:53.222959 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.222255 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:20:53.222959 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.222371 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.222959 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.222708 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:20:53.222959 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.222708 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:20:53.225717 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.225698 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.226863 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.226846 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.227240 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.227223 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.231857 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.231826 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cd65k\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.234857 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r6fxt\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.235791 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.236283 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qk7fh\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.236750 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.236838 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.237212 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-297ng\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.237473 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.237816 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:20:53.238251 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.237993 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:20:53.238667 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.238406 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.239954 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysconfig\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.239993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/740b9fc5-590e-4e08-9353-98530ff6a51d-tmp\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240028 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240057 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-socket-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqgp\" (UniqueName: \"kubernetes.io/projected/df0eb815-7f84-4620-bd4e-5699eeedbf3f-kube-api-access-cdqgp\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240311 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-system-cni-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240347 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-conf-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240376 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-sys\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-registration-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240464 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240494 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69331e75-5dda-4eb0-bab5-92289e6d91b9-host-slash\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-cni-bin\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240547 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-systemd-units\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.240838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-var-lib-kubelet\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.241394 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxxm\" (UniqueName: \"kubernetes.io/projected/740b9fc5-590e-4e08-9353-98530ff6a51d-kube-api-access-sbxxm\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.241394 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240669 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-socket-dir-parent\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.241394 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240724 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-systemd\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.241394 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240765 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/69331e75-5dda-4eb0-bab5-92289e6d91b9-iptables-alerter-script\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.241394 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.240797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysctl-d\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.242253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.242240 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vmtz5\""
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-etc-selinux\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243609 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xmf\" (UniqueName: \"kubernetes.io/projected/4f9336b9-d229-4225-bb2f-5e9509a44cdf-kube-api-access-l6xmf\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243636 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-slash\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-ovn\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243682 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovn-node-metrics-cert\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243709 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvdv\" (UniqueName: \"kubernetes.io/projected/69331e75-5dda-4eb0-bab5-92289e6d91b9-kube-api-access-6tvdv\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-kubelet\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-cni-multus\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/df0eb815-7f84-4620-bd4e-5699eeedbf3f-hosts-file\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243803 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-cnibin\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243827 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243850 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-k8s-cni-cncf-io\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-daemon-config\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243909 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-etc-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243931 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-node-log\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.243954 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysctl-conf\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244051 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/df0eb815-7f84-4620-bd4e-5699eeedbf3f-tmp-dir\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244107 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-var-lib-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-run-ovn-kubernetes\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244176 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-cni-netd\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244203 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244245 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-os-release\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244271 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02345ce1-bba1-425d-a97e-9bacb8d17a7d-serviceca\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f9336b9-d229-4225-bb2f-5e9509a44cdf-cni-binary-copy\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-kubelet\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") "
pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-log-socket\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovnkube-config\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovnkube-script-lib\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7hs\" (UniqueName: \"kubernetes.io/projected/a537ebfe-6f3d-459f-9bed-0c90b065a507-kube-api-access-qf7hs\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-run-netns\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.244918 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7bcf606c-7bd7-4d2b-841c-952f4528ebad-agent-certs\") pod \"konnectivity-agent-6vsjn\" (UID: \"7bcf606c-7bd7-4d2b-841c-952f4528ebad\") " pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7bcf606c-7bd7-4d2b-841c-952f4528ebad-konnectivity-ca\") pod \"konnectivity-agent-6vsjn\" (UID: \"7bcf606c-7bd7-4d2b-841c-952f4528ebad\") " pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244605 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntpf6\" (UniqueName: \"kubernetes.io/projected/f8090e4a-c63d-4fe0-a159-0e9452335122-kube-api-access-ntpf6\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244646 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-hostroot\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244669 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244695 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-system-cni-dir\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244720 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244744 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-modprobe-d\") pod \"tuned-s9shc\" (UID: 
\"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-systemd\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244789 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-sys-fs\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd2fz\" (UniqueName: \"kubernetes.io/projected/218a3414-b342-4f96-b216-476f4ac84b2d-kube-api-access-wd2fz\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244913 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-cni-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-etc-kubernetes\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-env-overrides\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.244994 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gwp\" (UniqueName: \"kubernetes.io/projected/7492c87e-50ec-4adc-9d4d-671ecf7e2492-kube-api-access-27gwp\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-kubernetes\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.245584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-run\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245062 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-lib-modules\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245084 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-host\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245147 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-tuned\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-os-release\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245200 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-netns\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 
18:20:53.245222 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-cni-bin\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-device-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245342 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02345ce1-bba1-425d-a97e-9bacb8d17a7d-host\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245413 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcb97\" (UniqueName: \"kubernetes.io/projected/02345ce1-bba1-425d-a97e-9bacb8d17a7d-kube-api-access-dcb97\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x" Apr 22 18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-cnibin\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 
18:20:53.246112 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.245501 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-multus-certs\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.317747 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.317709 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:52 +0000 UTC" deadline="2027-11-27 11:58:43.673039435 +0000 UTC" Apr 22 18:20:53.317747 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.317744 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14009h37m50.355298843s" Apr 22 18:20:53.344141 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.344115 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:53.345998 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.345966 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-lib-modules\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346004 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-host\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346026 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-tuned\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-os-release\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-netns\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346097 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-cni-bin\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346109 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-host\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346122 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346122 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-device-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346146 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02345ce1-bba1-425d-a97e-9bacb8d17a7d-host\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346171 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcb97\" (UniqueName: \"kubernetes.io/projected/02345ce1-bba1-425d-a97e-9bacb8d17a7d-kube-api-access-dcb97\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346184 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-lib-modules\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346196 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-cnibin\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-multus-certs\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346253 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-cnibin\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysconfig\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346294 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/740b9fc5-590e-4e08-9353-98530ff6a51d-tmp\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346320 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346325 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-cni-bin\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-socket-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-device-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346389 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346396 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdqgp\" (UniqueName: \"kubernetes.io/projected/df0eb815-7f84-4620-bd4e-5699eeedbf3f-kube-api-access-cdqgp\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-system-cni-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346453 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-multus-certs\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.346475 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346465 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-os-release\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-conf-dir\") pod \"multus-lffvw\" (UID: 
\"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346295 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-netns\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346420 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-conf-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346527 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02345ce1-bba1-425d-a97e-9bacb8d17a7d-host\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346546 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-sys\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346548 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-socket-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-registration-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346622 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-system-cni-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346650 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69331e75-5dda-4eb0-bab5-92289e6d91b9-host-slash\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-cni-bin\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-systemd-units\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346763 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-var-lib-kubelet\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346772 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.347390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxxm\" (UniqueName: \"kubernetes.io/projected/740b9fc5-590e-4e08-9353-98530ff6a51d-kube-api-access-sbxxm\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-socket-dir-parent\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-socket-dir-parent\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-sys\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69331e75-5dda-4eb0-bab5-92289e6d91b9-host-slash\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346417 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysconfig\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-cni-bin\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346930 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-var-lib-kubelet\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346943 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-systemd-units\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.346973 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-registration-dir\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347002 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-systemd\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/69331e75-5dda-4eb0-bab5-92289e6d91b9-iptables-alerter-script\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347044 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysctl-d\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-etc-selinux\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xmf\" (UniqueName: \"kubernetes.io/projected/4f9336b9-d229-4225-bb2f-5e9509a44cdf-kube-api-access-l6xmf\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-slash\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-ovn\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.348191 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347180 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovn-node-metrics-cert\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347203 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvdv\" (UniqueName: \"kubernetes.io/projected/69331e75-5dda-4eb0-bab5-92289e6d91b9-kube-api-access-6tvdv\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-kubelet\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysctl-d\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-cni-multus\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-systemd\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-ovn\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-etc-selinux\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347424 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/df0eb815-7f84-4620-bd4e-5699eeedbf3f-hosts-file\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-cnibin\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347493 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347517 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-k8s-cni-cncf-io\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-daemon-config\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347578 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-etc-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-node-log\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349091 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347621 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysctl-conf\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/df0eb815-7f84-4620-bd4e-5699eeedbf3f-tmp-dir\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-var-lib-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-run-ovn-kubernetes\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347700 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-slash\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-cni-netd\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347773 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-kubelet\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347777 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-os-release\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02345ce1-bba1-425d-a97e-9bacb8d17a7d-serviceca\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347815 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-var-lib-cni-multus\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347836 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f9336b9-d229-4225-bb2f-5e9509a44cdf-cni-binary-copy\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-kubelet\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-log-socket\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347911 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/df0eb815-7f84-4620-bd4e-5699eeedbf3f-hosts-file\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovnkube-config\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347956 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovnkube-script-lib\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.349936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347984 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7hs\" (UniqueName: \"kubernetes.io/projected/a537ebfe-6f3d-459f-9bed-0c90b065a507-kube-api-access-qf7hs\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.347997 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/69331e75-5dda-4eb0-bab5-92289e6d91b9-iptables-alerter-script\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-run-netns\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348046 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-run-netns\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348080 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7bcf606c-7bd7-4d2b-841c-952f4528ebad-agent-certs\") pod \"konnectivity-agent-6vsjn\" (UID: \"7bcf606c-7bd7-4d2b-841c-952f4528ebad\") " pod="kube-system/konnectivity-agent-6vsjn"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348082 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-cnibin\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348111 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7bcf606c-7bd7-4d2b-841c-952f4528ebad-konnectivity-ca\") pod \"konnectivity-agent-6vsjn\" (UID: \"7bcf606c-7bd7-4d2b-841c-952f4528ebad\") " pod="kube-system/konnectivity-agent-6vsjn"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348142 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntpf6\" (UniqueName: \"kubernetes.io/projected/f8090e4a-c63d-4fe0-a159-0e9452335122-kube-api-access-ntpf6\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-hostroot\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348196 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348225 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-system-cni-dir\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348281 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-modprobe-d\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-systemd\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-sys-fs\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348354 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd2fz\" (UniqueName: \"kubernetes.io/projected/218a3414-b342-4f96-b216-476f4ac84b2d-kube-api-access-wd2fz\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.350735 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-cni-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-etc-kubernetes\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348433 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-env-overrides\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovnkube-config\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348455 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27gwp\" (UniqueName: \"kubernetes.io/projected/7492c87e-50ec-4adc-9d4d-671ecf7e2492-kube-api-access-27gwp\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-kubernetes\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348500 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-node-log\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348504 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-run\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348554 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-run\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348591 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/218a3414-b342-4f96-b216-476f4ac84b2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348620 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-var-lib-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348659 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-run-ovn-kubernetes\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-host-run-k8s-cni-cncf-io\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348684 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-sysctl-conf\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.348737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-os-release\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349142 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovnkube-script-lib\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-run-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349286 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02345ce1-bba1-425d-a97e-9bacb8d17a7d-serviceca\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x"
Apr 22 18:20:53.351542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-cni-netd\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349538 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-kubelet\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349635 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a537ebfe-6f3d-459f-9bed-0c90b065a507-sys-fs\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID:
\"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349668 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-log-socket\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.349740 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/218a3414-b342-4f96-b216-476f4ac84b2d-system-cni-dir\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350174 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-hostroot\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.350271 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.350373 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:53.85035155 +0000 UTC m=+3.028654971 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350714 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7492c87e-50ec-4adc-9d4d-671ecf7e2492-env-overrides\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350741 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350799 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-cni-dir\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350835 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-modprobe-d\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350878 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f9336b9-d229-4225-bb2f-5e9509a44cdf-cni-binary-copy\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350882 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-systemd\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9336b9-d229-4225-bb2f-5e9509a44cdf-etc-kubernetes\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.350997 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7492c87e-50ec-4adc-9d4d-671ecf7e2492-etc-openvswitch\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.351048 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-kubernetes\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.351230 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f9336b9-d229-4225-bb2f-5e9509a44cdf-multus-daemon-config\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.352375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.351302 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/df0eb815-7f84-4620-bd4e-5699eeedbf3f-tmp-dir\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96" Apr 22 18:20:53.353299 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.351412 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7bcf606c-7bd7-4d2b-841c-952f4528ebad-konnectivity-ca\") pod \"konnectivity-agent-6vsjn\" (UID: \"7bcf606c-7bd7-4d2b-841c-952f4528ebad\") " pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:20:53.353299 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.351755 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/740b9fc5-590e-4e08-9353-98530ff6a51d-tmp\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.353299 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.351832 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7492c87e-50ec-4adc-9d4d-671ecf7e2492-ovn-node-metrics-cert\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.353299 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.352203 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/740b9fc5-590e-4e08-9353-98530ff6a51d-etc-tuned\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.353936 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.353913 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7bcf606c-7bd7-4d2b-841c-952f4528ebad-agent-certs\") pod \"konnectivity-agent-6vsjn\" (UID: \"7bcf606c-7bd7-4d2b-841c-952f4528ebad\") " pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:20:53.362237 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.362166 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" event={"ID":"e2c41a124527f124f4d00e3a7bbddbf2","Type":"ContainerStarted","Data":"b347803ab93380462de912d53585667babc61d7b70272b594e527e33abfa9fac"} Apr 22 18:20:53.364089 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.364065 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" event={"ID":"c759d37c693dd42a96a635e33d6e4429","Type":"ContainerStarted","Data":"274bd94a86b2bfc23c1da4d97f06c013b40a88c7d8243d91bcff96313272ebf3"} Apr 22 18:20:53.368910 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.368892 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:53.369025 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.368915 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:53.369025 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.368965 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod 
openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:53.369150 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.369028 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:53.869009861 +0000 UTC m=+3.047313258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:53.371926 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.371899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gwp\" (UniqueName: \"kubernetes.io/projected/7492c87e-50ec-4adc-9d4d-671ecf7e2492-kube-api-access-27gwp\") pod \"ovnkube-node-55s7b\" (UID: \"7492c87e-50ec-4adc-9d4d-671ecf7e2492\") " pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.375672 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.375644 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvdv\" (UniqueName: \"kubernetes.io/projected/69331e75-5dda-4eb0-bab5-92289e6d91b9-kube-api-access-6tvdv\") pod \"iptables-alerter-wxldw\" (UID: \"69331e75-5dda-4eb0-bab5-92289e6d91b9\") " pod="openshift-network-operator/iptables-alerter-wxldw" Apr 22 18:20:53.378665 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.378482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wd2fz\" (UniqueName: \"kubernetes.io/projected/218a3414-b342-4f96-b216-476f4ac84b2d-kube-api-access-wd2fz\") pod \"multus-additional-cni-plugins-trrn6\" (UID: \"218a3414-b342-4f96-b216-476f4ac84b2d\") " pod="openshift-multus/multus-additional-cni-plugins-trrn6" Apr 22 18:20:53.379775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.379664 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcb97\" (UniqueName: \"kubernetes.io/projected/02345ce1-bba1-425d-a97e-9bacb8d17a7d-kube-api-access-dcb97\") pod \"node-ca-pvr6x\" (UID: \"02345ce1-bba1-425d-a97e-9bacb8d17a7d\") " pod="openshift-image-registry/node-ca-pvr6x" Apr 22 18:20:53.383057 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.383037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntpf6\" (UniqueName: \"kubernetes.io/projected/f8090e4a-c63d-4fe0-a159-0e9452335122-kube-api-access-ntpf6\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:20:53.383151 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.383038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xmf\" (UniqueName: \"kubernetes.io/projected/4f9336b9-d229-4225-bb2f-5e9509a44cdf-kube-api-access-l6xmf\") pod \"multus-lffvw\" (UID: \"4f9336b9-d229-4225-bb2f-5e9509a44cdf\") " pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.383966 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.383941 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7hs\" (UniqueName: \"kubernetes.io/projected/a537ebfe-6f3d-459f-9bed-0c90b065a507-kube-api-access-qf7hs\") pod \"aws-ebs-csi-driver-node-j6xkg\" (UID: \"a537ebfe-6f3d-459f-9bed-0c90b065a507\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.384476 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.384430 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdqgp\" (UniqueName: \"kubernetes.io/projected/df0eb815-7f84-4620-bd4e-5699eeedbf3f-kube-api-access-cdqgp\") pod \"node-resolver-9qx96\" (UID: \"df0eb815-7f84-4620-bd4e-5699eeedbf3f\") " pod="openshift-dns/node-resolver-9qx96" Apr 22 18:20:53.384707 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.384685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxxm\" (UniqueName: \"kubernetes.io/projected/740b9fc5-590e-4e08-9353-98530ff6a51d-kube-api-access-sbxxm\") pod \"tuned-s9shc\" (UID: \"740b9fc5-590e-4e08-9353-98530ff6a51d\") " pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.522140 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.522061 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" Apr 22 18:20:53.534050 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.534025 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-trrn6" Apr 22 18:20:53.548905 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.548874 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lffvw" Apr 22 18:20:53.557656 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.557632 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:20:53.568328 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.568309 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9qx96" Apr 22 18:20:53.569829 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.569812 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:20:53.575303 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.575283 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pvr6x" Apr 22 18:20:53.582836 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.582821 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wxldw" Apr 22 18:20:53.590353 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.590337 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s9shc" Apr 22 18:20:53.851104 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.851012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:20:53.851246 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.851158 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:53.851246 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.851232 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:54.851216581 +0000 UTC m=+4.029519995 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:53.951818 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:53.951784 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:20:53.951996 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.951958 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:53.951996 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.951981 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:53.951996 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.951997 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:53.952117 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:53.952057 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:54.95203852 +0000 UTC m=+4.130341935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:54.287016 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:54.286951 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod740b9fc5_590e_4e08_9353_98530ff6a51d.slice/crio-be9d3b6c8be57b74b1d8f74aee20322fa18b7b00790e91a37722c0497038a88c WatchSource:0}: Error finding container be9d3b6c8be57b74b1d8f74aee20322fa18b7b00790e91a37722c0497038a88c: Status 404 returned error can't find the container with id be9d3b6c8be57b74b1d8f74aee20322fa18b7b00790e91a37722c0497038a88c Apr 22 18:20:54.290734 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:54.290713 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bcf606c_7bd7_4d2b_841c_952f4528ebad.slice/crio-ce32bbcd67d8571744fffb0825a34f9b328ff3407b5fef9872618a041c822cff WatchSource:0}: Error finding container ce32bbcd67d8571744fffb0825a34f9b328ff3407b5fef9872618a041c822cff: Status 404 returned error can't find the container with id ce32bbcd67d8571744fffb0825a34f9b328ff3407b5fef9872618a041c822cff Apr 22 18:20:54.295073 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:54.295026 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492c87e_50ec_4adc_9d4d_671ecf7e2492.slice/crio-be8de9cdf110e7e0720b703ac4456711e2108d8833a269a73ddfba31187d33a4 WatchSource:0}: Error finding container 
be8de9cdf110e7e0720b703ac4456711e2108d8833a269a73ddfba31187d33a4: Status 404 returned error can't find the container with id be8de9cdf110e7e0720b703ac4456711e2108d8833a269a73ddfba31187d33a4 Apr 22 18:20:54.295869 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:20:54.295806 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02345ce1_bba1_425d_a97e_9bacb8d17a7d.slice/crio-cb4e868170178e2ca94850317ae03fa286d7caf06fbcd9652a2919464603e8b2 WatchSource:0}: Error finding container cb4e868170178e2ca94850317ae03fa286d7caf06fbcd9652a2919464603e8b2: Status 404 returned error can't find the container with id cb4e868170178e2ca94850317ae03fa286d7caf06fbcd9652a2919464603e8b2 Apr 22 18:20:54.318756 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.318616 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:52 +0000 UTC" deadline="2028-02-01 03:59:59.742034482 +0000 UTC" Apr 22 18:20:54.318756 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.318756 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15585h39m5.423281393s" Apr 22 18:20:54.366712 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.366679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pvr6x" event={"ID":"02345ce1-bba1-425d-a97e-9bacb8d17a7d","Type":"ContainerStarted","Data":"cb4e868170178e2ca94850317ae03fa286d7caf06fbcd9652a2919464603e8b2"} Apr 22 18:20:54.367587 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.367543 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"be8de9cdf110e7e0720b703ac4456711e2108d8833a269a73ddfba31187d33a4"} Apr 22 18:20:54.368385 ip-10-0-136-5 kubenswrapper[2580]: I0422 
18:20:54.368363 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wxldw" event={"ID":"69331e75-5dda-4eb0-bab5-92289e6d91b9","Type":"ContainerStarted","Data":"a7049445a1203d05c4142a754837940294bc95a9ff4fa4b7d4202d702879afad"} Apr 22 18:20:54.369396 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.369378 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerStarted","Data":"33d011cf56245e4a82bb78c47f1d649eeba4e79bdaa431ffbea2f1cccb40c5f0"} Apr 22 18:20:54.370497 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.370476 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lffvw" event={"ID":"4f9336b9-d229-4225-bb2f-5e9509a44cdf","Type":"ContainerStarted","Data":"cdbd027764fed46705edf84a9e7e2e4fb798000603190f42e681519cede5b201"} Apr 22 18:20:54.371499 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.371482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6vsjn" event={"ID":"7bcf606c-7bd7-4d2b-841c-952f4528ebad","Type":"ContainerStarted","Data":"ce32bbcd67d8571744fffb0825a34f9b328ff3407b5fef9872618a041c822cff"} Apr 22 18:20:54.372375 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.372356 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s9shc" event={"ID":"740b9fc5-590e-4e08-9353-98530ff6a51d","Type":"ContainerStarted","Data":"be9d3b6c8be57b74b1d8f74aee20322fa18b7b00790e91a37722c0497038a88c"} Apr 22 18:20:54.373296 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.373264 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" event={"ID":"a537ebfe-6f3d-459f-9bed-0c90b065a507","Type":"ContainerStarted","Data":"96e674c99dfe77fc5f98e492a1e740855ca97232ca2b657afba31f8478caa83a"} Apr 22 18:20:54.374171 
ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.374151 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9qx96" event={"ID":"df0eb815-7f84-4620-bd4e-5699eeedbf3f","Type":"ContainerStarted","Data":"367e86d01edef9be05f3ddc77679f5a791e282732d6455f4ddff1a42c0277d1c"}
Apr 22 18:20:54.858676 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.858635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:54.858881 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:54.858793 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:54.858881 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:54.858860 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:56.858839933 +0000 UTC m=+6.037143347 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:54.960038 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:54.959997 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:54.960215 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:54.960175 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:54.960215 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:54.960193 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:54.960215 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:54.960206 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:54.960371 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:54.960263 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:56.960245218 +0000 UTC m=+6.138548623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:55.356049 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:55.356015 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:55.356502 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:55.356146 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:20:55.356620 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:55.356601 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:55.356729 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:55.356709 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:20:55.394684 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:55.394003 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" event={"ID":"c759d37c693dd42a96a635e33d6e4429","Type":"ContainerStarted","Data":"6ce57f0c8b3b6786b31cd3b5b9cbaf34ccb2c5a26a8dc406f4e8313e6a6a0189"}
Apr 22 18:20:55.406618 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:55.405609 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" podStartSLOduration=3.405583143 podStartE2EDuration="3.405583143s" podCreationTimestamp="2026-04-22 18:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:55.405519727 +0000 UTC m=+4.583823148" watchObservedRunningTime="2026-04-22 18:20:55.405583143 +0000 UTC m=+4.583886557"
Apr 22 18:20:55.406804 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:55.406677 2580 generic.go:358] "Generic (PLEG): container finished" podID="e2c41a124527f124f4d00e3a7bbddbf2" containerID="ea25fd5b7cbc468428063bf4272ef0d86b76d257af6c8d3e263a33d232e71c75" exitCode=0
Apr 22 18:20:55.406804 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:55.406728 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" event={"ID":"e2c41a124527f124f4d00e3a7bbddbf2","Type":"ContainerDied","Data":"ea25fd5b7cbc468428063bf4272ef0d86b76d257af6c8d3e263a33d232e71c75"}
Apr 22 18:20:56.417341 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:56.417265 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" event={"ID":"e2c41a124527f124f4d00e3a7bbddbf2","Type":"ContainerStarted","Data":"64e424ad31dd2ffdc269d79300b88d480a08168949d35b2b8e7eec272cb9abdc"}
Apr 22 18:20:56.879584 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:56.879489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:56.879742 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:56.879657 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:56.879742 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:56.879722 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:00.879702342 +0000 UTC m=+10.058005741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:56.980173 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:56.980116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:56.980355 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:56.980312 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:56.980355 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:56.980331 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:56.980355 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:56.980343 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:56.980513 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:56.980401 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:00.980381525 +0000 UTC m=+10.158684938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:57.356186 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:57.356102 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:57.356352 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:57.356252 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:20:57.356771 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:57.356740 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:57.356872 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:57.356850 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:20:59.356137 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:59.356101 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:20:59.356603 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:59.356221 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:20:59.356687 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:20:59.356657 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:20:59.356811 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:20:59.356786 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:00.912282 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:00.912240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:00.912789 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:00.912402 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:00.912789 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:00.912465 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:08.912447102 +0000 UTC m=+18.090750521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:01.012662 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:01.012620 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:01.012816 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:01.012798 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:21:01.012858 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:01.012823 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:21:01.012858 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:01.012836 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:01.012933 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:01.012904 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.012883531 +0000 UTC m=+18.191186933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:01.361682 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:01.361603 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:01.361846 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:01.361733 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:01.362437 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:01.362164 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:01.363344 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:01.363288 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:03.355724 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:03.355694 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:03.356140 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:03.355838 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:03.356140 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:03.355894 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:03.356140 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:03.356001 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:05.358249 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:05.358214 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:05.358687 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:05.358222 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:05.358687 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:05.358337 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:05.358687 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:05.358417 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:07.355835 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:07.355802 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:07.356258 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:07.355942 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:07.356335 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:07.356314 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:07.356436 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:07.356417 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:08.971902 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:08.971865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:08.972347 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:08.972032 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:08.972347 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:08.972102 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:24.972085206 +0000 UTC m=+34.150388605 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:09.072372 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:09.072339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:09.072552 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:09.072530 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:21:09.072625 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:09.072580 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:21:09.072625 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:09.072594 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:09.072726 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:09.072646 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:25.072633519 +0000 UTC m=+34.250936920 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:09.356282 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:09.356206 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:09.356450 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:09.356332 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:09.356450 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:09.356381 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:09.356450 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:09.356446 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:11.356352 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:11.356316 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:11.356707 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:11.356399 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:11.356707 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:11.356492 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:11.356707 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:11.356611 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:12.446215 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.445790 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s9shc" event={"ID":"740b9fc5-590e-4e08-9353-98530ff6a51d","Type":"ContainerStarted","Data":"8c18d9bb0e0a8b0dee53781c71995d1dd2080db58ff7ec5fa69fe841b2e1c0cd"}
Apr 22 18:21:12.447124 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.447103 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" event={"ID":"a537ebfe-6f3d-459f-9bed-0c90b065a507","Type":"ContainerStarted","Data":"9e4d556aa473addc7b4fcd518bcfa1f7a5388afa9ea0caac4d0fff2e1720b2f2"}
Apr 22 18:21:12.448446 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.448423 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9qx96" event={"ID":"df0eb815-7f84-4620-bd4e-5699eeedbf3f","Type":"ContainerStarted","Data":"87540638e7ac7980012890d3e738d78b59325a1ad6addaf1e49990d5defa886c"}
Apr 22 18:21:12.451741 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.451720 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pvr6x" event={"ID":"02345ce1-bba1-425d-a97e-9bacb8d17a7d","Type":"ContainerStarted","Data":"4f523f98d13a1d6e53f519d67b2fc03593a24ad4dd0da98523012a820c427f47"}
Apr 22 18:21:12.453939 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.453922 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"867784f767a6eec98b2d69532e61caacc497fd1993326a48e04262203ba4052d"}
Apr 22 18:21:12.454044 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.453947 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"6c9d39b67095ef3b9048e4b28638f99cae83c63049886b58c3e1da9bb001d97c"}
Apr 22 18:21:12.454044 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.453962 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"e951c17573d2a67e8825fdbcb3e86b7e57483fcc4754090ddcaca006d4a97c93"}
Apr 22 18:21:12.454044 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.453974 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"e0801d1227a99a8bcbff461b0e6c5ee0e39a444e82b7def487bd2482688a2b49"}
Apr 22 18:21:12.454044 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.453982 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"4611c9cb5f8a42a99a719b56cfb125893d7eeee043d13dbaa94050730a239f7c"}
Apr 22 18:21:12.454044 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.453991 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"399173dc9d9370e5b65054111783f00a4ccadd659c06cf62b8b43e1c71d77814"}
Apr 22 18:21:12.455033 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.455013 2580 generic.go:358] "Generic (PLEG): container finished" podID="218a3414-b342-4f96-b216-476f4ac84b2d" containerID="7368b0a0d5e61e11e3ec6c463a5b061b7fdea8b75fd517daeb3abf1e6ff21554" exitCode=0
Apr 22 18:21:12.455121 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.455064 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerDied","Data":"7368b0a0d5e61e11e3ec6c463a5b061b7fdea8b75fd517daeb3abf1e6ff21554"}
Apr 22 18:21:12.456282 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.456269 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lffvw" event={"ID":"4f9336b9-d229-4225-bb2f-5e9509a44cdf","Type":"ContainerStarted","Data":"31427117d5733e20e9ebbcddcddb0f7f2ae7a839da87a6c98d1c9e1636de1e1e"}
Apr 22 18:21:12.457452 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.457376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6vsjn" event={"ID":"7bcf606c-7bd7-4d2b-841c-952f4528ebad","Type":"ContainerStarted","Data":"2249e65388ec8a28d963271e4ed705d81e482b0f55737f601972783ea15cabea"}
Apr 22 18:21:12.468165 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.468121 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s9shc" podStartSLOduration=4.247561455 podStartE2EDuration="21.468108932s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.289399815 +0000 UTC m=+3.467703217" lastFinishedPulling="2026-04-22 18:21:11.509947297 +0000 UTC m=+20.688250694" observedRunningTime="2026-04-22 18:21:12.467906546 +0000 UTC m=+21.646209964" watchObservedRunningTime="2026-04-22 18:21:12.468108932 +0000 UTC m=+21.646412353"
Apr 22 18:21:12.468317 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.468297 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" podStartSLOduration=20.468290016 podStartE2EDuration="20.468290016s" podCreationTimestamp="2026-04-22 18:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:56.43067066 +0000 UTC m=+5.608974078" watchObservedRunningTime="2026-04-22 18:21:12.468290016 +0000 UTC m=+21.646593437"
Apr 22 18:21:12.517048 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.517004 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6vsjn" podStartSLOduration=4.302970398 podStartE2EDuration="21.516989976s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.295880896 +0000 UTC m=+3.474184304" lastFinishedPulling="2026-04-22 18:21:11.509900474 +0000 UTC m=+20.688203882" observedRunningTime="2026-04-22 18:21:12.489231597 +0000 UTC m=+21.667535016" watchObservedRunningTime="2026-04-22 18:21:12.516989976 +0000 UTC m=+21.695293396"
Apr 22 18:21:12.537392 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.537357 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lffvw" podStartSLOduration=4.126456697 podStartE2EDuration="21.537347041s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.301465801 +0000 UTC m=+3.479769202" lastFinishedPulling="2026-04-22 18:21:11.712356142 +0000 UTC m=+20.890659546" observedRunningTime="2026-04-22 18:21:12.517137011 +0000 UTC m=+21.695440419" watchObservedRunningTime="2026-04-22 18:21:12.537347041 +0000 UTC m=+21.715650459"
Apr 22 18:21:12.556992 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.556964 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9qx96" podStartSLOduration=4.147512408 podStartE2EDuration="21.556954455s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.302719956 +0000 UTC m=+3.481023354" lastFinishedPulling="2026-04-22 18:21:11.712161989 +0000 UTC m=+20.890465401" observedRunningTime="2026-04-22 18:21:12.537703066 +0000 UTC m=+21.716006486" watchObservedRunningTime="2026-04-22 18:21:12.556954455 +0000 UTC m=+21.735257894"
Apr 22 18:21:12.584599 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:12.584543 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pvr6x" podStartSLOduration=4.372031715 podStartE2EDuration="21.584531724s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.297474797 +0000 UTC m=+3.475778197" lastFinishedPulling="2026-04-22 18:21:11.509974809 +0000 UTC m=+20.688278206" observedRunningTime="2026-04-22 18:21:12.56107307 +0000 UTC m=+21.739376488" watchObservedRunningTime="2026-04-22 18:21:12.584531724 +0000 UTC m=+21.762835122"
Apr 22 18:21:13.355938 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:13.355912 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:13.356101 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:13.355952 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:13.356101 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:13.356038 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:21:13.356211 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:13.356121 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0"
Apr 22 18:21:13.419006 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:13.418975 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:21:13.461432 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:13.461402 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" event={"ID":"a537ebfe-6f3d-459f-9bed-0c90b065a507","Type":"ContainerStarted","Data":"0371512cc070985dc918146ed7892264919b7928c05cb16d522f8d1e4299b1d8"}
Apr 22 18:21:13.463452 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:13.463429 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wxldw" event={"ID":"69331e75-5dda-4eb0-bab5-92289e6d91b9","Type":"ContainerStarted","Data":"42bec6c08e4cbba34a072021261795d905ceebea76a029ea7ddeacd5f3203175"}
Apr 22 18:21:13.491996 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:13.491953 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wxldw" podStartSLOduration=5.276392453 podStartE2EDuration="22.491937242s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.294556147 +0000 UTC m=+3.472859561" lastFinishedPulling="2026-04-22 18:21:11.51010094 +0000 UTC m=+20.688404350" observedRunningTime="2026-04-22 18:21:13.491696613 +0000 UTC m=+22.670000031" watchObservedRunningTime="2026-04-22 18:21:13.491937242 +0000 UTC m=+22.670240663"
Apr 22 18:21:14.299003 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:14.298907 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started"
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:21:13.41899679Z","UUID":"43b1671c-c8ac-4069-84f4-d25abf040a05","Handler":null,"Name":"","Endpoint":""} Apr 22 18:21:14.300788 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:14.300767 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:21:14.300899 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:14.300796 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:21:14.466857 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:14.466817 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" event={"ID":"a537ebfe-6f3d-459f-9bed-0c90b065a507","Type":"ContainerStarted","Data":"3a1577e8c71489ae67104862420c1a12b38d1f0bf61e45dd383b259c25a3980c"} Apr 22 18:21:14.470378 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:14.470347 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"ebd26d4d36112ccf777b97b7766dd47abcb3d687e970e250299b64c0a7237fc8"} Apr 22 18:21:15.356316 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:15.356282 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:21:15.356496 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:15.356413 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122" Apr 22 18:21:15.356496 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:15.356468 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:21:15.356646 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:15.356610 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0" Apr 22 18:21:15.666958 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:15.666927 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:21:15.667815 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:15.667798 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:21:15.684730 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:15.684677 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6xkg" podStartSLOduration=4.760757289 podStartE2EDuration="24.684662935s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.300299226 +0000 UTC m=+3.478602629" lastFinishedPulling="2026-04-22 18:21:14.224204863 +0000 UTC m=+23.402508275" observedRunningTime="2026-04-22 18:21:14.493342307 +0000 UTC m=+23.671645725" watchObservedRunningTime="2026-04-22 18:21:15.684662935 +0000 UTC m=+24.862966353" Apr 22 18:21:16.484974 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:16.484833 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:21:16.485260 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:16.485244 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6vsjn" Apr 22 18:21:17.356040 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.356007 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:21:17.356040 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.356036 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:21:17.356809 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:17.356142 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122" Apr 22 18:21:17.356809 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:17.356251 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0" Apr 22 18:21:17.477487 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.477451 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" event={"ID":"7492c87e-50ec-4adc-9d4d-671ecf7e2492","Type":"ContainerStarted","Data":"550203cd0f9ef1f5a57cf5fb08dcc87b5fb6cd56d28cf2269cfc211d26a5b781"} Apr 22 18:21:17.477725 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.477710 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:21:17.477804 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.477735 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:21:17.477804 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.477747 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:21:17.479772 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.479744 2580 generic.go:358] "Generic (PLEG): container finished" podID="218a3414-b342-4f96-b216-476f4ac84b2d" containerID="565e136391011b299443b1de6a0f98659f14e532887c7e3f25db2ca50ba260dd" exitCode=0 Apr 22 18:21:17.479883 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.479821 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerDied","Data":"565e136391011b299443b1de6a0f98659f14e532887c7e3f25db2ca50ba260dd"} Apr 22 18:21:17.492981 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.492962 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:21:17.493160 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.493141 2580 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" Apr 22 18:21:17.503548 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:17.503514 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b" podStartSLOduration=9.023638643 podStartE2EDuration="26.503501809s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.296913568 +0000 UTC m=+3.475216971" lastFinishedPulling="2026-04-22 18:21:11.776776727 +0000 UTC m=+20.955080137" observedRunningTime="2026-04-22 18:21:17.503110394 +0000 UTC m=+26.681413813" watchObservedRunningTime="2026-04-22 18:21:17.503501809 +0000 UTC m=+26.681805228" Apr 22 18:21:18.522099 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:18.521931 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qnx5b"] Apr 22 18:21:18.522444 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:18.522198 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:21:18.522444 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:18.522289 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122" Apr 22 18:21:18.524082 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:18.524060 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5pf5l"] Apr 22 18:21:18.524163 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:18.524152 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:21:18.524239 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:18.524225 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0" Apr 22 18:21:19.484953 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:19.484920 2580 generic.go:358] "Generic (PLEG): container finished" podID="218a3414-b342-4f96-b216-476f4ac84b2d" containerID="6ee75ae57b8f7e9d067380ec8615930723286e3cc33270f977e7b588e80726a8" exitCode=0 Apr 22 18:21:19.485101 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:19.485010 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerDied","Data":"6ee75ae57b8f7e9d067380ec8615930723286e3cc33270f977e7b588e80726a8"} Apr 22 18:21:20.356444 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:20.356412 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:21:20.356444 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:20.356425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:21:20.356922 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:20.356517 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0" Apr 22 18:21:20.356922 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:20.356624 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122" Apr 22 18:21:21.490397 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:21.490363 2580 generic.go:358] "Generic (PLEG): container finished" podID="218a3414-b342-4f96-b216-476f4ac84b2d" containerID="ef2feeebc0c09ace638d76e5503572c756a8a3cd3be4b02d01e0d15fdde7f56d" exitCode=0 Apr 22 18:21:21.490760 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:21.490423 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerDied","Data":"ef2feeebc0c09ace638d76e5503572c756a8a3cd3be4b02d01e0d15fdde7f56d"} Apr 22 18:21:22.356108 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:22.356075 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:21:22.356281 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:22.356075 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:21:22.356281 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:22.356200 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122" Apr 22 18:21:22.356374 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:22.356295 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0" Apr 22 18:21:24.356397 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.356223 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:21:24.356850 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.356229 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:21:24.356850 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.356511 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122" Apr 22 18:21:24.356850 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.356603 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5pf5l" podUID="8e4c591e-67eb-4346-8dee-d861e58cc4a0" Apr 22 18:21:24.620390 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.620319 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeReady" Apr 22 18:21:24.620555 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.620467 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:21:24.671128 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.671099 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h9kgh"] Apr 22 18:21:24.688148 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.688119 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wprc6"] Apr 22 18:21:24.704226 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.704202 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h9kgh"] Apr 22 18:21:24.704226 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.704229 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wprc6"] Apr 22 18:21:24.704420 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.704322 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 22 18:21:24.704420 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.704319 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.706659 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.706634 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:21:24.706847 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.706831 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:21:24.707018 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.707004 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:21:24.707186 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.707171 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:21:24.707352 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.707339 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pkrg\"" Apr 22 18:21:24.707721 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.707579 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:21:24.707884 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.707865 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d4f9\"" Apr 22 18:21:24.791510 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.791466 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.791714 ip-10-0-136-5 kubenswrapper[2580]: I0422 
18:21:24.791575 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvhz\" (UniqueName: \"kubernetes.io/projected/b7b97893-5155-42ea-878e-ea8aa0dd69d1-kube-api-access-kxvhz\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 22 18:21:24.791714 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.791620 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xbb\" (UniqueName: \"kubernetes.io/projected/86e8cf17-2f3c-4df9-abce-314dd430b892-kube-api-access-77xbb\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.791714 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.791666 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86e8cf17-2f3c-4df9-abce-314dd430b892-config-volume\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.791854 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.791715 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86e8cf17-2f3c-4df9-abce-314dd430b892-tmp-dir\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.791854 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.791793 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 
22 18:21:24.892671 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.892592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.892670 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.892733 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:25.392717512 +0000 UTC m=+34.571020909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.892671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.892757 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.892802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvhz\" (UniqueName: 
\"kubernetes.io/projected/b7b97893-5155-42ea-878e-ea8aa0dd69d1-kube-api-access-kxvhz\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.892810 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:25.392795533 +0000 UTC m=+34.571098934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found Apr 22 18:21:24.892863 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.892850 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77xbb\" (UniqueName: \"kubernetes.io/projected/86e8cf17-2f3c-4df9-abce-314dd430b892-kube-api-access-77xbb\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.893113 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.892886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86e8cf17-2f3c-4df9-abce-314dd430b892-config-volume\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh" Apr 22 18:21:24.893113 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.892911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86e8cf17-2f3c-4df9-abce-314dd430b892-tmp-dir\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " 
pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:24.893250 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.893232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86e8cf17-2f3c-4df9-abce-314dd430b892-tmp-dir\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:24.893406 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.893390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86e8cf17-2f3c-4df9-abce-314dd430b892-config-volume\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:24.905257 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.905220 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xbb\" (UniqueName: \"kubernetes.io/projected/86e8cf17-2f3c-4df9-abce-314dd430b892-kube-api-access-77xbb\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:24.905481 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.905462 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvhz\" (UniqueName: \"kubernetes.io/projected/b7b97893-5155-42ea-878e-ea8aa0dd69d1-kube-api-access-kxvhz\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:24.993514 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:24.993472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:24.993703 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.993630 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:24.993703 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:24.993687 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:56.993672393 +0000 UTC m=+66.171975789 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:25.094797 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:25.094757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:25.094989 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.094923 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:21:25.094989 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.094949 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:21:25.094989 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.094963 2580 projected.go:194] Error preparing data for projected volume kube-api-access-5gzzz for pod openshift-network-diagnostics/network-check-target-5pf5l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:25.095149 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.095030 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz podName:8e4c591e-67eb-4346-8dee-d861e58cc4a0 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:57.095009189 +0000 UTC m=+66.273312594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5gzzz" (UniqueName: "kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz") pod "network-check-target-5pf5l" (UID: "8e4c591e-67eb-4346-8dee-d861e58cc4a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:25.396867 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:25.396829 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:25.397471 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:25.396888 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:25.397471 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.396983 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:25.397471 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.396983 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:25.397471 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.397047 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:26.397029316 +0000 UTC m=+35.575332713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:25.397471 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:25.397061 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:26.397054575 +0000 UTC m=+35.575357972 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found
Apr 22 18:21:26.356103 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.356062 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:26.356279 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.356070 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:26.358769 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.358744 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:21:26.359878 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.359847 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fll8t\""
Apr 22 18:21:26.359990 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.359886 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cm497\""
Apr 22 18:21:26.359990 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.359925 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:21:26.359990 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.359933 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:21:26.403630 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.403599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:26.404008 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:26.403657 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:26.404008 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:26.403797 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:26.404008 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:26.403855 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:28.403837035 +0000 UTC m=+37.582140432 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:26.404274 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:26.404256 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:26.404343 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:26.404303 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:28.404290206 +0000 UTC m=+37.582593623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found
Apr 22 18:21:28.418421 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:28.418388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:28.419065 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:28.418445 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:28.419065 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:28.418543 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:28.419065 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:28.418583 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:28.419065 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:28.418632 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:32.418618397 +0000 UTC m=+41.596921794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found
Apr 22 18:21:28.419065 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:28.418646 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:32.418640339 +0000 UTC m=+41.596943735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:28.506271 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:28.506233 2580 generic.go:358] "Generic (PLEG): container finished" podID="218a3414-b342-4f96-b216-476f4ac84b2d" containerID="73d341d6639d6fcdc4fdff550836c3e9ab578b55705c70779c86ccf85108a194" exitCode=0
Apr 22 18:21:28.506445 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:28.506289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerDied","Data":"73d341d6639d6fcdc4fdff550836c3e9ab578b55705c70779c86ccf85108a194"}
Apr 22 18:21:29.510954 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:29.510920 2580 generic.go:358] "Generic (PLEG): container finished" podID="218a3414-b342-4f96-b216-476f4ac84b2d" containerID="689fffdb482ffb2f62343e9ae56db54f8098660a5e745e357459f16e48a3770b" exitCode=0
Apr 22 18:21:29.511344 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:29.510983 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerDied","Data":"689fffdb482ffb2f62343e9ae56db54f8098660a5e745e357459f16e48a3770b"}
Apr 22 18:21:30.516739 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:30.516514 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-trrn6" event={"ID":"218a3414-b342-4f96-b216-476f4ac84b2d","Type":"ContainerStarted","Data":"ed4fbaa5cc160f8846be5822445d5909d5505ee271490707d03695ce191d36c8"}
Apr 22 18:21:30.539432 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:30.539386 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-trrn6" podStartSLOduration=6.430097126 podStartE2EDuration="39.539374628s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:20:54.302791291 +0000 UTC m=+3.481094689" lastFinishedPulling="2026-04-22 18:21:27.412068794 +0000 UTC m=+36.590372191" observedRunningTime="2026-04-22 18:21:30.538286924 +0000 UTC m=+39.716590344" watchObservedRunningTime="2026-04-22 18:21:30.539374628 +0000 UTC m=+39.717678046"
Apr 22 18:21:32.444325 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:32.444287 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:32.444325 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:32.444341 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:32.444759 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:32.444438 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:32.444759 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:32.444437 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:32.444759 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:32.444494 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:40.444480435 +0000 UTC m=+49.622783831 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found
Apr 22 18:21:32.444759 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:32.444507 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:40.444501454 +0000 UTC m=+49.622804852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:40.498984 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:40.498942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:40.498984 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:40.498990 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:40.499468 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:40.499080 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:40.499468 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:40.499106 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:40.499468 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:40.499135 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:56.499121059 +0000 UTC m=+65.677424456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:40.499468 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:40.499170 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:56.499151502 +0000 UTC m=+65.677454918 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found
Apr 22 18:21:49.352625 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.352591 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"]
Apr 22 18:21:49.393116 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.393092 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"]
Apr 22 18:21:49.393258 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.393226 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.398134 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.398109 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 18:21:49.398422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.398402 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 18:21:49.398557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.398439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 18:21:49.398557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.398459 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 18:21:49.398755 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.398738 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vprpf\""
Apr 22 18:21:49.457930 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.457903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkct\" (UniqueName: \"kubernetes.io/projected/bdd8e71d-4d25-4dd6-af1a-e19b146bbb85-kube-api-access-6xkct\") pod \"managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz\" (UID: \"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.458047 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.457936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bdd8e71d-4d25-4dd6-af1a-e19b146bbb85-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz\" (UID: \"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.500040 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.500021 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-55s7b"
Apr 22 18:21:49.558814 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.558788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkct\" (UniqueName: \"kubernetes.io/projected/bdd8e71d-4d25-4dd6-af1a-e19b146bbb85-kube-api-access-6xkct\") pod \"managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz\" (UID: \"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.558938 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.558827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bdd8e71d-4d25-4dd6-af1a-e19b146bbb85-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz\" (UID: \"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.561704 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.561676 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bdd8e71d-4d25-4dd6-af1a-e19b146bbb85-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz\" (UID: \"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.573828 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.573810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkct\" (UniqueName: \"kubernetes.io/projected/bdd8e71d-4d25-4dd6-af1a-e19b146bbb85-kube-api-access-6xkct\") pod \"managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz\" (UID: \"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.716732 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.716696 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"
Apr 22 18:21:49.888068 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:49.888031 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz"]
Apr 22 18:21:49.893053 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:21:49.893028 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdd8e71d_4d25_4dd6_af1a_e19b146bbb85.slice/crio-8bf4a37cba6a2a8336e0af2206be57f7e75473cbed09ca32f1b20bfa9caf7243 WatchSource:0}: Error finding container 8bf4a37cba6a2a8336e0af2206be57f7e75473cbed09ca32f1b20bfa9caf7243: Status 404 returned error can't find the container with id 8bf4a37cba6a2a8336e0af2206be57f7e75473cbed09ca32f1b20bfa9caf7243
Apr 22 18:21:50.553838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:50.553797 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz" event={"ID":"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85","Type":"ContainerStarted","Data":"8bf4a37cba6a2a8336e0af2206be57f7e75473cbed09ca32f1b20bfa9caf7243"}
Apr 22 18:21:52.558340 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:52.558248 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz" event={"ID":"bdd8e71d-4d25-4dd6-af1a-e19b146bbb85","Type":"ContainerStarted","Data":"b29cadb561757ccebfa855f8462267387b0083f63ef50bfa251366832c91fbf2"}
Apr 22 18:21:52.574945 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:52.574896 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d45f9f4d5-7rvxz" podStartSLOduration=1.241791837 podStartE2EDuration="3.574882953s" podCreationTimestamp="2026-04-22 18:21:49 +0000 UTC" firstStartedPulling="2026-04-22 18:21:49.89503026 +0000 UTC m=+59.073333658" lastFinishedPulling="2026-04-22 18:21:52.228121377 +0000 UTC m=+61.406424774" observedRunningTime="2026-04-22 18:21:52.574296988 +0000 UTC m=+61.752600407" watchObservedRunningTime="2026-04-22 18:21:52.574882953 +0000 UTC m=+61.753186350"
Apr 22 18:21:56.508628 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:56.508579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:21:56.509105 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:56.508662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:21:56.509105 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:56.508673 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:56.509105 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:56.508753 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:28.508732617 +0000 UTC m=+97.687036014 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:56.509105 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:56.508759 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:56.509105 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:56.508808 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:28.508793414 +0000 UTC m=+97.687096811 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found
Apr 22 18:21:57.011946 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.011910 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b"
Apr 22 18:21:57.014210 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.014189 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:21:57.022772 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:57.022754 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:21:57.022862 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:21:57.022820 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:01.022800723 +0000 UTC m=+130.201104121 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : secret "metrics-daemon-secret" not found
Apr 22 18:21:57.113010 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.112977 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:57.115666 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.115644 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:21:57.125913 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.125889 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:21:57.136981 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.136960 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzzz\" (UniqueName: \"kubernetes.io/projected/8e4c591e-67eb-4346-8dee-d861e58cc4a0-kube-api-access-5gzzz\") pod \"network-check-target-5pf5l\" (UID: \"8e4c591e-67eb-4346-8dee-d861e58cc4a0\") " pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:57.273704 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.273619 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cm497\""
Apr 22 18:21:57.281512 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.281481 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:21:57.395333 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.395303 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5pf5l"]
Apr 22 18:21:57.399048 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:21:57.399015 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e4c591e_67eb_4346_8dee_d861e58cc4a0.slice/crio-87b05c388360c03a424c6e724b247c8f7862a6d6886825b6794b797376c7b110 WatchSource:0}: Error finding container 87b05c388360c03a424c6e724b247c8f7862a6d6886825b6794b797376c7b110: Status 404 returned error can't find the container with id 87b05c388360c03a424c6e724b247c8f7862a6d6886825b6794b797376c7b110
Apr 22 18:21:57.568311 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:21:57.568217 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5pf5l" event={"ID":"8e4c591e-67eb-4346-8dee-d861e58cc4a0","Type":"ContainerStarted","Data":"87b05c388360c03a424c6e724b247c8f7862a6d6886825b6794b797376c7b110"}
Apr 22 18:22:01.577396 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:01.577360 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5pf5l" event={"ID":"8e4c591e-67eb-4346-8dee-d861e58cc4a0","Type":"ContainerStarted","Data":"68d6e2c8623686842f02d8098647660e1da91a5f978d507db5f8eac0d6a555fb"}
Apr 22 18:22:01.577836 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:01.577499 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5pf5l"
Apr 22 18:22:01.594088 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:01.594029 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5pf5l" podStartSLOduration=66.961721331 podStartE2EDuration="1m10.594012102s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:21:57.401092093 +0000 UTC m=+66.579395490" lastFinishedPulling="2026-04-22 18:22:01.033382864 +0000 UTC m=+70.211686261" observedRunningTime="2026-04-22 18:22:01.593484524 +0000 UTC m=+70.771787942" watchObservedRunningTime="2026-04-22 18:22:01.594012102 +0000 UTC m=+70.772315522"
Apr 22 18:22:28.531420 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:28.531384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:22:28.531843 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:28.531429 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:22:28.531843 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:28.531531 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:22:28.531843 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:28.531535 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:22:28.531843 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:28.531601 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls podName:86e8cf17-2f3c-4df9-abce-314dd430b892 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:32.531587769 +0000 UTC m=+161.709891166 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls") pod "dns-default-h9kgh" (UID: "86e8cf17-2f3c-4df9-abce-314dd430b892") : secret "dns-default-metrics-tls" not found Apr 22 18:22:28.531843 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:28.531617 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert podName:b7b97893-5155-42ea-878e-ea8aa0dd69d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:32.531609013 +0000 UTC m=+161.709912413 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert") pod "ingress-canary-wprc6" (UID: "b7b97893-5155-42ea-878e-ea8aa0dd69d1") : secret "canary-serving-cert" not found Apr 22 18:22:32.582938 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:32.582905 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5pf5l" Apr 22 18:22:49.970292 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.970253 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-776f5fddd9-zf8mv"] Apr 22 18:22:49.974730 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.974714 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:49.977156 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.977137 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:22:49.978247 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.978226 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r4mpc\"" Apr 22 18:22:49.978917 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.978904 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:22:49.980716 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.980698 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:22:49.987214 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.986933 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:22:49.990316 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:49.989258 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-776f5fddd9-zf8mv"] Apr 22 18:22:50.072547 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.072512 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb"] Apr 22 18:22:50.072771 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.072748 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-ca-trust-extracted\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " 
pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.072833 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.072806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-installation-pull-secrets\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.072938 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.072914 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.073016 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.072994 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-bound-sa-token\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.073059 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.073025 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2th2\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-kube-api-access-n2th2\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.073059 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.073053 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-image-registry-private-configuration\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.073136 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.073072 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-certificates\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.073136 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.073091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-trusted-ca\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.075433 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.075417 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9"] Apr 22 18:22:50.075594 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.075557 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" Apr 22 18:22:50.078298 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.078279 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hrccp"] Apr 22 18:22:50.078413 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.078402 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" Apr 22 18:22:50.079074 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.079055 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-d54t6\"" Apr 22 18:22:50.080993 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.080975 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-ltt2b\"" Apr 22 18:22:50.081128 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.081110 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.081187 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.081131 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:22:50.081517 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.081497 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:22:50.083078 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.083062 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-z685d\"" Apr 22 18:22:50.083333 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.083320 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:22:50.083687 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.083671 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:22:50.083871 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.083860 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:22:50.084114 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.084103 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:22:50.088216 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.088198 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:22:50.095788 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.095769 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb"] Apr 22 18:22:50.096700 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.096679 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9"] Apr 22 18:22:50.098133 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.098113 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hrccp"] Apr 22 18:22:50.163769 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.163742 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n"] Apr 22 18:22:50.166609 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.166595 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.168449 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.168417 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:22:50.168449 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.168435 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:22:50.168629 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.168520 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:22:50.168970 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.168955 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:22:50.169100 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.169087 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-rl6vd\"" Apr 22 18:22:50.173914 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.173898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-snapshots\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.174001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.173923 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-service-ca-bundle\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.174001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.173952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.173986 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-bound-sa-token\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174137 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n2th2\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-kube-api-access-n2th2\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174137 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-serving-cert\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.174137 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:50.174067 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:22:50.174137 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-image-registry-private-configuration\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174137 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:50.174116 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776f5fddd9-zf8mv: secret "image-registry-tls" not found Apr 22 18:22:50.174137 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-certificates\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: 
\"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174154 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5gr7\" (UniqueName: \"kubernetes.io/projected/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-kube-api-access-m5gr7\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:50.174183 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls podName:0a1afc20-9c8c-4a8d-88a3-7199108df2f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:50.674162989 +0000 UTC m=+119.852466400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls") pod "image-registry-776f5fddd9-zf8mv" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3") : secret "image-registry-tls" not found Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-trusted-ca\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bls2n\" (UniqueName: \"kubernetes.io/projected/0872edf3-f4f1-4b55-a3a3-ab4b349db77e-kube-api-access-bls2n\") pod 
\"volume-data-source-validator-7c6cbb6c87-4j7n9\" (UID: \"0872edf3-f4f1-4b55-a3a3-ab4b349db77e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-tmp\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-ca-trust-extracted\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174411 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174390 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzgd\" (UniqueName: \"kubernetes.io/projected/542c10ef-6e44-4abe-9574-3653eac5bfae-kube-api-access-ztzgd\") pod \"network-check-source-8894fc9bd-nsxxb\" (UID: \"542c10ef-6e44-4abe-9574-3653eac5bfae\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" Apr 22 18:22:50.174777 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-installation-pull-secrets\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174777 
ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174494 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.174777 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174667 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-ca-trust-extracted\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.174777 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.174715 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-certificates\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.175020 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.175003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-trusted-ca\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.176806 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.176786 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-image-registry-private-configuration\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.177220 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.177197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-installation-pull-secrets\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.177314 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.177263 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n"] Apr 22 18:22:50.182479 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.182454 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-bound-sa-token\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.183314 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.183291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2th2\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-kube-api-access-n2th2\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.275124 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275038 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275124 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275076 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-snapshots\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275124 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-service-ca-bundle\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275132 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrp5w\" (UniqueName: \"kubernetes.io/projected/37cf0ec8-5f5f-44a5-9915-565f95226bc6-kube-api-access-jrp5w\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.275383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-serving-cert\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 
18:22:50.275383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5gr7\" (UniqueName: \"kubernetes.io/projected/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-kube-api-access-m5gr7\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275251 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37cf0ec8-5f5f-44a5-9915-565f95226bc6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.275383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275281 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bls2n\" (UniqueName: \"kubernetes.io/projected/0872edf3-f4f1-4b55-a3a3-ab4b349db77e-kube-api-access-bls2n\") pod \"volume-data-source-validator-7c6cbb6c87-4j7n9\" (UID: \"0872edf3-f4f1-4b55-a3a3-ab4b349db77e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" Apr 22 18:22:50.275383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275345 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cf0ec8-5f5f-44a5-9915-565f95226bc6-config\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.275726 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-tmp\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275726 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzgd\" (UniqueName: \"kubernetes.io/projected/542c10ef-6e44-4abe-9574-3653eac5bfae-kube-api-access-ztzgd\") pod \"network-check-source-8894fc9bd-nsxxb\" (UID: \"542c10ef-6e44-4abe-9574-3653eac5bfae\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" Apr 22 18:22:50.275834 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-service-ca-bundle\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275834 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275816 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-tmp\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.275834 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.275825 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-snapshots\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.276468 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.276452 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.277667 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.277648 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-serving-cert\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.284032 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.284009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5gr7\" (UniqueName: \"kubernetes.io/projected/40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e-kube-api-access-m5gr7\") pod \"insights-operator-585dfdc468-hrccp\" (UID: \"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e\") " pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.285184 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.285159 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bls2n\" (UniqueName: \"kubernetes.io/projected/0872edf3-f4f1-4b55-a3a3-ab4b349db77e-kube-api-access-bls2n\") pod \"volume-data-source-validator-7c6cbb6c87-4j7n9\" (UID: \"0872edf3-f4f1-4b55-a3a3-ab4b349db77e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" Apr 22 18:22:50.286016 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.286000 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzgd\" (UniqueName: \"kubernetes.io/projected/542c10ef-6e44-4abe-9574-3653eac5bfae-kube-api-access-ztzgd\") pod \"network-check-source-8894fc9bd-nsxxb\" (UID: 
\"542c10ef-6e44-4abe-9574-3653eac5bfae\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" Apr 22 18:22:50.375887 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.375860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37cf0ec8-5f5f-44a5-9915-565f95226bc6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.375989 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.375891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cf0ec8-5f5f-44a5-9915-565f95226bc6-config\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.375989 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.375937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrp5w\" (UniqueName: \"kubernetes.io/projected/37cf0ec8-5f5f-44a5-9915-565f95226bc6-kube-api-access-jrp5w\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.376493 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.376460 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cf0ec8-5f5f-44a5-9915-565f95226bc6-config\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.378060 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.378044 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/37cf0ec8-5f5f-44a5-9915-565f95226bc6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.383472 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.383455 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrp5w\" (UniqueName: \"kubernetes.io/projected/37cf0ec8-5f5f-44a5-9915-565f95226bc6-kube-api-access-jrp5w\") pod \"service-ca-operator-d6fc45fc5-p6k9n\" (UID: \"37cf0ec8-5f5f-44a5-9915-565f95226bc6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.385383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.385371 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" Apr 22 18:22:50.392006 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.391988 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" Apr 22 18:22:50.397815 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.397796 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hrccp" Apr 22 18:22:50.474850 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.474819 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" Apr 22 18:22:50.535227 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.535185 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb"] Apr 22 18:22:50.539281 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:22:50.539231 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542c10ef_6e44_4abe_9574_3653eac5bfae.slice/crio-b055a37247e0181f1e682f8539dd0acfa7a03cc2bf8a26db9f1800d48d60e30a WatchSource:0}: Error finding container b055a37247e0181f1e682f8539dd0acfa7a03cc2bf8a26db9f1800d48d60e30a: Status 404 returned error can't find the container with id b055a37247e0181f1e682f8539dd0acfa7a03cc2bf8a26db9f1800d48d60e30a Apr 22 18:22:50.601427 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.601402 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n"] Apr 22 18:22:50.604825 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:22:50.604797 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37cf0ec8_5f5f_44a5_9915_565f95226bc6.slice/crio-ffaac00559944ab3e61a54b1c3d439def2118aa97547df7707bbec7c8d14c70b WatchSource:0}: Error finding container ffaac00559944ab3e61a54b1c3d439def2118aa97547df7707bbec7c8d14c70b: Status 404 returned error can't find the container with id ffaac00559944ab3e61a54b1c3d439def2118aa97547df7707bbec7c8d14c70b Apr 22 18:22:50.667103 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.667074 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" event={"ID":"37cf0ec8-5f5f-44a5-9915-565f95226bc6","Type":"ContainerStarted","Data":"ffaac00559944ab3e61a54b1c3d439def2118aa97547df7707bbec7c8d14c70b"} Apr 22 
18:22:50.668305 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.668276 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" event={"ID":"542c10ef-6e44-4abe-9574-3653eac5bfae","Type":"ContainerStarted","Data":"f9dbf126581bbb63aa3c86828607573d1985054f4244b2dd292417d49b2dcc79"} Apr 22 18:22:50.668406 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.668312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" event={"ID":"542c10ef-6e44-4abe-9574-3653eac5bfae","Type":"ContainerStarted","Data":"b055a37247e0181f1e682f8539dd0acfa7a03cc2bf8a26db9f1800d48d60e30a"} Apr 22 18:22:50.678172 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.678147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:50.681294 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:50.678344 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:22:50.681294 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:50.678363 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776f5fddd9-zf8mv: secret "image-registry-tls" not found Apr 22 18:22:50.681294 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:50.678425 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls podName:0a1afc20-9c8c-4a8d-88a3-7199108df2f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:51.678407224 +0000 UTC m=+120.856710636 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls") pod "image-registry-776f5fddd9-zf8mv" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3") : secret "image-registry-tls" not found Apr 22 18:22:50.685785 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.685750 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nsxxb" podStartSLOduration=0.685737924 podStartE2EDuration="685.737924ms" podCreationTimestamp="2026-04-22 18:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:22:50.685035556 +0000 UTC m=+119.863338980" watchObservedRunningTime="2026-04-22 18:22:50.685737924 +0000 UTC m=+119.864041342" Apr 22 18:22:50.740646 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.740622 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9"] Apr 22 18:22:50.743811 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:50.743787 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hrccp"] Apr 22 18:22:50.743968 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:22:50.743938 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0872edf3_f4f1_4b55_a3a3_ab4b349db77e.slice/crio-9a7f79f9f3b246de51ec1bc22445375b8a4c5ec07699ff6597c88569bc77cba1 WatchSource:0}: Error finding container 9a7f79f9f3b246de51ec1bc22445375b8a4c5ec07699ff6597c88569bc77cba1: Status 404 returned error can't find the container with id 9a7f79f9f3b246de51ec1bc22445375b8a4c5ec07699ff6597c88569bc77cba1 Apr 22 18:22:50.747355 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:22:50.747326 2580 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b6cfcb_98f8_49a8_8df5_3b2ce0142c7e.slice/crio-1414145c728d896e04b4ea58fc2a444b51a53a8a7e80186c000f7730a24c2be7 WatchSource:0}: Error finding container 1414145c728d896e04b4ea58fc2a444b51a53a8a7e80186c000f7730a24c2be7: Status 404 returned error can't find the container with id 1414145c728d896e04b4ea58fc2a444b51a53a8a7e80186c000f7730a24c2be7 Apr 22 18:22:51.671790 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:51.671736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrccp" event={"ID":"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e","Type":"ContainerStarted","Data":"1414145c728d896e04b4ea58fc2a444b51a53a8a7e80186c000f7730a24c2be7"} Apr 22 18:22:51.673161 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:51.673129 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" event={"ID":"0872edf3-f4f1-4b55-a3a3-ab4b349db77e","Type":"ContainerStarted","Data":"9a7f79f9f3b246de51ec1bc22445375b8a4c5ec07699ff6597c88569bc77cba1"} Apr 22 18:22:51.686822 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:51.686726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:51.686953 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:51.686834 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:22:51.686953 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:51.686852 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776f5fddd9-zf8mv: 
secret "image-registry-tls" not found Apr 22 18:22:51.686953 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:51.686906 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls podName:0a1afc20-9c8c-4a8d-88a3-7199108df2f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:53.686891837 +0000 UTC m=+122.865195234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls") pod "image-registry-776f5fddd9-zf8mv" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3") : secret "image-registry-tls" not found Apr 22 18:22:53.486017 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.485929 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86"] Apr 22 18:22:53.488959 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.488943 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" Apr 22 18:22:53.491072 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.491052 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:22:53.491553 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.491528 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:22:53.491692 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.491533 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6tzdx\"" Apr 22 18:22:53.499482 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.499460 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86"] Apr 22 18:22:53.602639 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.602597 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnr5\" (UniqueName: \"kubernetes.io/projected/8f2bc053-e57e-498b-b8aa-dc6e579b317d-kube-api-access-sbnr5\") pod \"migrator-74bb7799d9-jcl86\" (UID: \"8f2bc053-e57e-498b-b8aa-dc6e579b317d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" Apr 22 18:22:53.679143 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.679107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" event={"ID":"37cf0ec8-5f5f-44a5-9915-565f95226bc6","Type":"ContainerStarted","Data":"be85fd7773cee652b4ab665cdb05b12c9440ee3f70fad78d61452108d3ceddbd"} Apr 22 18:22:53.680505 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.680478 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-hrccp" event={"ID":"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e","Type":"ContainerStarted","Data":"fcd79d9da1dd8c958f0216f399a5c312028b25fba0816589697efc86dd2e1a54"} Apr 22 18:22:53.681788 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.681766 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" event={"ID":"0872edf3-f4f1-4b55-a3a3-ab4b349db77e","Type":"ContainerStarted","Data":"cec0e46748152f0371275b216130ce3def59f6625ce0e9a1fb1a1913c4da5c6c"} Apr 22 18:22:53.696258 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.696216 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" podStartSLOduration=1.085825093 podStartE2EDuration="3.696202781s" podCreationTimestamp="2026-04-22 18:22:50 +0000 UTC" firstStartedPulling="2026-04-22 18:22:50.60644354 +0000 UTC m=+119.784746940" lastFinishedPulling="2026-04-22 18:22:53.216821218 +0000 UTC m=+122.395124628" observedRunningTime="2026-04-22 18:22:53.695739747 +0000 UTC m=+122.874043166" watchObservedRunningTime="2026-04-22 18:22:53.696202781 +0000 UTC m=+122.874506201" Apr 22 18:22:53.703538 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.703511 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnr5\" (UniqueName: \"kubernetes.io/projected/8f2bc053-e57e-498b-b8aa-dc6e579b317d-kube-api-access-sbnr5\") pod \"migrator-74bb7799d9-jcl86\" (UID: \"8f2bc053-e57e-498b-b8aa-dc6e579b317d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" Apr 22 18:22:53.703667 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.703555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod 
\"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:53.703725 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:53.703681 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:22:53.703725 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:53.703692 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776f5fddd9-zf8mv: secret "image-registry-tls" not found Apr 22 18:22:53.703825 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:53.703737 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls podName:0a1afc20-9c8c-4a8d-88a3-7199108df2f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:57.703725105 +0000 UTC m=+126.882028502 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls") pod "image-registry-776f5fddd9-zf8mv" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3") : secret "image-registry-tls" not found Apr 22 18:22:53.725086 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.725058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnr5\" (UniqueName: \"kubernetes.io/projected/8f2bc053-e57e-498b-b8aa-dc6e579b317d-kube-api-access-sbnr5\") pod \"migrator-74bb7799d9-jcl86\" (UID: \"8f2bc053-e57e-498b-b8aa-dc6e579b317d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" Apr 22 18:22:53.738610 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.738503 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-hrccp" podStartSLOduration=1.2753207930000001 podStartE2EDuration="3.738485458s" podCreationTimestamp="2026-04-22 18:22:50 +0000 UTC" firstStartedPulling="2026-04-22 18:22:50.749337911 +0000 UTC m=+119.927641308" lastFinishedPulling="2026-04-22 18:22:53.212502563 +0000 UTC m=+122.390805973" observedRunningTime="2026-04-22 18:22:53.738450339 +0000 UTC m=+122.916753769" watchObservedRunningTime="2026-04-22 18:22:53.738485458 +0000 UTC m=+122.916788881" Apr 22 18:22:53.740125 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.740081 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4j7n9" podStartSLOduration=1.280853247 podStartE2EDuration="3.740068647s" podCreationTimestamp="2026-04-22 18:22:50 +0000 UTC" firstStartedPulling="2026-04-22 18:22:50.746054587 +0000 UTC m=+119.924357985" lastFinishedPulling="2026-04-22 18:22:53.205269973 +0000 UTC m=+122.383573385" observedRunningTime="2026-04-22 18:22:53.716202795 +0000 UTC m=+122.894506214" 
watchObservedRunningTime="2026-04-22 18:22:53.740068647 +0000 UTC m=+122.918372066" Apr 22 18:22:53.799298 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.799269 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" Apr 22 18:22:53.927058 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:53.927025 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86"] Apr 22 18:22:53.931088 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:22:53.931059 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2bc053_e57e_498b_b8aa_dc6e579b317d.slice/crio-f24298545e4da47fdf7543e4416207b78051e233792e4f01ab73e68ac38329b6 WatchSource:0}: Error finding container f24298545e4da47fdf7543e4416207b78051e233792e4f01ab73e68ac38329b6: Status 404 returned error can't find the container with id f24298545e4da47fdf7543e4416207b78051e233792e4f01ab73e68ac38329b6 Apr 22 18:22:54.686042 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:54.685990 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" event={"ID":"8f2bc053-e57e-498b-b8aa-dc6e579b317d","Type":"ContainerStarted","Data":"f24298545e4da47fdf7543e4416207b78051e233792e4f01ab73e68ac38329b6"} Apr 22 18:22:55.690479 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:55.690446 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" event={"ID":"8f2bc053-e57e-498b-b8aa-dc6e579b317d","Type":"ContainerStarted","Data":"e1e3e64087a2a49cf77194d0f5211b497ca77d26d8323b95176a82f4c84ca443"} Apr 22 18:22:55.690479 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:55.690484 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" event={"ID":"8f2bc053-e57e-498b-b8aa-dc6e579b317d","Type":"ContainerStarted","Data":"6609f84866fab49018ab0bd141c0d814815edaf26e629e7561a2f6db547e0448"} Apr 22 18:22:55.709963 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:55.706846 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jcl86" podStartSLOduration=1.776030826 podStartE2EDuration="2.706829929s" podCreationTimestamp="2026-04-22 18:22:53 +0000 UTC" firstStartedPulling="2026-04-22 18:22:53.933111693 +0000 UTC m=+123.111415096" lastFinishedPulling="2026-04-22 18:22:54.863910799 +0000 UTC m=+124.042214199" observedRunningTime="2026-04-22 18:22:55.705351922 +0000 UTC m=+124.883655341" watchObservedRunningTime="2026-04-22 18:22:55.706829929 +0000 UTC m=+124.885133349" Apr 22 18:22:56.933202 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:56.933175 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9qx96_df0eb815-7f84-4620-bd4e-5699eeedbf3f/dns-node-resolver/0.log" Apr 22 18:22:57.732293 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:57.732257 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:22:57.732462 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:57.732366 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:22:57.732462 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:57.732378 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776f5fddd9-zf8mv: secret 
"image-registry-tls" not found Apr 22 18:22:57.732462 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:22:57.732429 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls podName:0a1afc20-9c8c-4a8d-88a3-7199108df2f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:05.732414884 +0000 UTC m=+134.910718280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls") pod "image-registry-776f5fddd9-zf8mv" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3") : secret "image-registry-tls" not found Apr 22 18:22:58.331935 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:58.331907 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pvr6x_02345ce1-bba1-425d-a97e-9bacb8d17a7d/node-ca/0.log" Apr 22 18:22:59.336746 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:59.336719 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jcl86_8f2bc053-e57e-498b-b8aa-dc6e579b317d/migrator/0.log" Apr 22 18:22:59.534672 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:22:59.534639 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jcl86_8f2bc053-e57e-498b-b8aa-dc6e579b317d/graceful-termination/0.log" Apr 22 18:23:01.056645 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:01.056608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:23:01.057046 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:01.056757 2580 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:23:01.057046 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:01.056826 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs podName:f8090e4a-c63d-4fe0-a159-0e9452335122 nodeName:}" failed. No retries permitted until 2026-04-22 18:25:03.056809104 +0000 UTC m=+252.235112518 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs") pod "network-metrics-daemon-qnx5b" (UID: "f8090e4a-c63d-4fe0-a159-0e9452335122") : secret "metrics-daemon-secret" not found Apr 22 18:23:05.795791 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:05.795753 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:05.798320 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:05.798298 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"image-registry-776f5fddd9-zf8mv\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:05.886684 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:05.886651 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r4mpc\"" Apr 22 18:23:05.894629 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:05.894609 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:06.015904 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:06.015872 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-776f5fddd9-zf8mv"] Apr 22 18:23:06.018852 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:06.018827 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a1afc20_9c8c_4a8d_88a3_7199108df2f3.slice/crio-33c9a18ebb086ff1dcd3b62fddb41aadf789e2cd4e83d62a88f98e8ebe05701f WatchSource:0}: Error finding container 33c9a18ebb086ff1dcd3b62fddb41aadf789e2cd4e83d62a88f98e8ebe05701f: Status 404 returned error can't find the container with id 33c9a18ebb086ff1dcd3b62fddb41aadf789e2cd4e83d62a88f98e8ebe05701f Apr 22 18:23:06.716457 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:06.716422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" event={"ID":"0a1afc20-9c8c-4a8d-88a3-7199108df2f3","Type":"ContainerStarted","Data":"37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f"} Apr 22 18:23:06.716457 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:06.716460 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" event={"ID":"0a1afc20-9c8c-4a8d-88a3-7199108df2f3","Type":"ContainerStarted","Data":"33c9a18ebb086ff1dcd3b62fddb41aadf789e2cd4e83d62a88f98e8ebe05701f"} Apr 22 18:23:06.716709 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:06.716579 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:06.740498 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:06.740447 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" 
podStartSLOduration=17.740434624 podStartE2EDuration="17.740434624s" podCreationTimestamp="2026-04-22 18:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:06.739490453 +0000 UTC m=+135.917793871" watchObservedRunningTime="2026-04-22 18:23:06.740434624 +0000 UTC m=+135.918738034" Apr 22 18:23:18.188915 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.188877 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-62hkj"] Apr 22 18:23:18.195199 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.195179 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.202010 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.201991 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:23:18.202301 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.202286 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:23:18.202301 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.202297 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4b6kd\"" Apr 22 18:23:18.211689 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.211666 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-776f5fddd9-zf8mv"] Apr 22 18:23:18.212666 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.212646 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-62hkj"] Apr 22 18:23:18.254402 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.254376 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-789c4c4587-j277h"] Apr 22 18:23:18.257839 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.257823 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.271025 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.271005 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-789c4c4587-j277h"] Apr 22 18:23:18.285790 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.285767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-crio-socket\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.285887 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.285818 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-data-volume\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.285887 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.285873 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-image-registry-private-configuration\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.285969 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.285913 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmf6g\" (UniqueName: \"kubernetes.io/projected/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-kube-api-access-pmf6g\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.285969 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.285930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-ca-trust-extracted\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286042 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.285977 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.286042 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.286018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-trusted-ca\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286042 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.286039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxj4k\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-kube-api-access-mxj4k\") pod 
\"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286146 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.286067 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-installation-pull-secrets\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286146 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.286111 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-registry-tls\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286146 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.286126 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-registry-certificates\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286146 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.286140 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-bound-sa-token\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.286276 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:23:18.286154 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.386431 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386410 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxj4k\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-kube-api-access-mxj4k\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.386557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-installation-pull-secrets\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.386557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-registry-tls\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.386557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386500 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-registry-certificates\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.386557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386523 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-bound-sa-token\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.386752 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.386752 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-crio-socket\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.386854 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386777 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-data-volume\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.386854 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:23:18.386812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-image-registry-private-configuration\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.386854 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386818 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-crio-socket\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.387007 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386943 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmf6g\" (UniqueName: \"kubernetes.io/projected/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-kube-api-access-pmf6g\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.387007 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.386978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-ca-trust-extracted\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.387109 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.387024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.387109 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.387054 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-trusted-ca\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.387257 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.387212 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-data-volume\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.387588 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.387337 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-registry-certificates\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.387707 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.387590 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-ca-trust-extracted\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.387837 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.387802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.388378 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.388357 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-trusted-ca\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.389349 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.389331 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.389525 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.389501 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-image-registry-private-configuration\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.389673 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.389658 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-registry-tls\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" 
Apr 22 18:23:18.389818 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.389802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-installation-pull-secrets\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.395667 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.395640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-bound-sa-token\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.395976 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.395955 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxj4k\" (UniqueName: \"kubernetes.io/projected/b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22-kube-api-access-mxj4k\") pod \"image-registry-789c4c4587-j277h\" (UID: \"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22\") " pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.396028 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.396006 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmf6g\" (UniqueName: \"kubernetes.io/projected/bd5b58fe-bffd-45e2-972d-e6aeb36c6a40-kube-api-access-pmf6g\") pod \"insights-runtime-extractor-62hkj\" (UID: \"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40\") " pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.504204 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.504139 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-62hkj" Apr 22 18:23:18.566358 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.566332 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:18.631652 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.631621 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-62hkj"] Apr 22 18:23:18.636220 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:18.636190 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd5b58fe_bffd_45e2_972d_e6aeb36c6a40.slice/crio-7bada5e2e0097a414c1dd4855af9d3f4d0cee277ab8ce653a911bef9f02b7c9b WatchSource:0}: Error finding container 7bada5e2e0097a414c1dd4855af9d3f4d0cee277ab8ce653a911bef9f02b7c9b: Status 404 returned error can't find the container with id 7bada5e2e0097a414c1dd4855af9d3f4d0cee277ab8ce653a911bef9f02b7c9b Apr 22 18:23:18.690955 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.690930 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-789c4c4587-j277h"] Apr 22 18:23:18.693578 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:18.693533 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb497ce63_7e7e_4e1c_bc4c_5e2e58a2ec22.slice/crio-0f22c0a27b7a005bafbd5c3d3213902f65d2036b9b001a6c9f1c9fdb24e46965 WatchSource:0}: Error finding container 0f22c0a27b7a005bafbd5c3d3213902f65d2036b9b001a6c9f1c9fdb24e46965: Status 404 returned error can't find the container with id 0f22c0a27b7a005bafbd5c3d3213902f65d2036b9b001a6c9f1c9fdb24e46965 Apr 22 18:23:18.746618 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.746592 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-62hkj" 
event={"ID":"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40","Type":"ContainerStarted","Data":"e61d6be8c4ec7bd8929bae05b9613f16af987387df395c6eff4ffb56f42ce3d7"} Apr 22 18:23:18.746728 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.746624 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-62hkj" event={"ID":"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40","Type":"ContainerStarted","Data":"7bada5e2e0097a414c1dd4855af9d3f4d0cee277ab8ce653a911bef9f02b7c9b"} Apr 22 18:23:18.747514 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:18.747491 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789c4c4587-j277h" event={"ID":"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22","Type":"ContainerStarted","Data":"0f22c0a27b7a005bafbd5c3d3213902f65d2036b9b001a6c9f1c9fdb24e46965"} Apr 22 18:23:19.752199 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:19.752163 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789c4c4587-j277h" event={"ID":"b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22","Type":"ContainerStarted","Data":"e8e832dc70e8bab8293751afab91cac6dde627149a82ca92e58e5e81d2bc2eb4"} Apr 22 18:23:19.752678 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:19.752303 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:19.754083 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:19.754056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-62hkj" event={"ID":"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40","Type":"ContainerStarted","Data":"71967499a61dfdacdb55e4968236983472f6d5f4d5f4fe1104446af0d7e37db4"} Apr 22 18:23:19.770083 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:19.770034 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-789c4c4587-j277h" 
podStartSLOduration=1.770019584 podStartE2EDuration="1.770019584s" podCreationTimestamp="2026-04-22 18:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:19.769509788 +0000 UTC m=+148.947813207" watchObservedRunningTime="2026-04-22 18:23:19.770019584 +0000 UTC m=+148.948323003" Apr 22 18:23:20.758811 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:20.758776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-62hkj" event={"ID":"bd5b58fe-bffd-45e2-972d-e6aeb36c6a40","Type":"ContainerStarted","Data":"079bad7c68646578aed80089a1d3303a50d0925a82992b5395d9e5f80673bad5"} Apr 22 18:23:20.778197 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:20.778156 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-62hkj" podStartSLOduration=0.916262055 podStartE2EDuration="2.778141208s" podCreationTimestamp="2026-04-22 18:23:18 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.700984564 +0000 UTC m=+147.879287965" lastFinishedPulling="2026-04-22 18:23:20.562863707 +0000 UTC m=+149.741167118" observedRunningTime="2026-04-22 18:23:20.777316925 +0000 UTC m=+149.955620343" watchObservedRunningTime="2026-04-22 18:23:20.778141208 +0000 UTC m=+149.956444626" Apr 22 18:23:24.332727 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.332693 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw"] Apr 22 18:23:24.337034 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.337016 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:24.339106 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.339082 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dsx2k\"" Apr 22 18:23:24.339214 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.339137 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:23:24.346431 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.346410 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw"] Apr 22 18:23:24.435707 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.435681 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kwvtw\" (UID: \"525ae2d0-e343-4511-a0b1-078612ff16e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:24.537092 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:24.537066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kwvtw\" (UID: \"525ae2d0-e343-4511-a0b1-078612ff16e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:24.537222 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:24.537208 2580 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not 
found Apr 22 18:23:24.537287 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:24.537276 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates podName:525ae2d0-e343-4511-a0b1-078612ff16e1 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:25.037261575 +0000 UTC m=+154.215564972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-kwvtw" (UID: "525ae2d0-e343-4511-a0b1-078612ff16e1") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:23:25.040434 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:25.040401 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kwvtw\" (UID: \"525ae2d0-e343-4511-a0b1-078612ff16e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:25.040626 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:25.040531 2580 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:23:25.040626 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:25.040605 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates podName:525ae2d0-e343-4511-a0b1-078612ff16e1 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:26.040588261 +0000 UTC m=+155.218891663 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-kwvtw" (UID: "525ae2d0-e343-4511-a0b1-078612ff16e1") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:23:26.048062 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:26.048022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kwvtw\" (UID: \"525ae2d0-e343-4511-a0b1-078612ff16e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:26.050617 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:26.050598 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/525ae2d0-e343-4511-a0b1-078612ff16e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kwvtw\" (UID: \"525ae2d0-e343-4511-a0b1-078612ff16e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:26.145877 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:26.145846 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" Apr 22 18:23:26.260819 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:26.260792 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw"] Apr 22 18:23:26.264074 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:26.264043 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525ae2d0_e343_4511_a0b1_078612ff16e1.slice/crio-0d09949ad826a0fe6ce75e37558ee02df60a0580a48ecfcbebcac9a3620b58bc WatchSource:0}: Error finding container 0d09949ad826a0fe6ce75e37558ee02df60a0580a48ecfcbebcac9a3620b58bc: Status 404 returned error can't find the container with id 0d09949ad826a0fe6ce75e37558ee02df60a0580a48ecfcbebcac9a3620b58bc Apr 22 18:23:26.775704 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:26.775666 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" event={"ID":"525ae2d0-e343-4511-a0b1-078612ff16e1","Type":"ContainerStarted","Data":"0d09949ad826a0fe6ce75e37558ee02df60a0580a48ecfcbebcac9a3620b58bc"} Apr 22 18:23:27.716811 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:27.716765 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wprc6" podUID="b7b97893-5155-42ea-878e-ea8aa0dd69d1" Apr 22 18:23:27.722903 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:27.722876 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-h9kgh" podUID="86e8cf17-2f3c-4df9-abce-314dd430b892" Apr 22 18:23:27.779292 ip-10-0-136-5 kubenswrapper[2580]: I0422 
18:23:27.779257 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" event={"ID":"525ae2d0-e343-4511-a0b1-078612ff16e1","Type":"ContainerStarted","Data":"88d76916b0251d9b6f46d4bde7cd1e54b377be8bca12c0b8d82cd6e78b2f0b71"} Apr 22 18:23:27.779450 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:27.779427 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h9kgh" Apr 22 18:23:27.793629 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:27.793586 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw" podStartSLOduration=2.882063408 podStartE2EDuration="3.793551677s" podCreationTimestamp="2026-04-22 18:23:24 +0000 UTC" firstStartedPulling="2026-04-22 18:23:26.265855433 +0000 UTC m=+155.444158830" lastFinishedPulling="2026-04-22 18:23:27.177343688 +0000 UTC m=+156.355647099" observedRunningTime="2026-04-22 18:23:27.793269702 +0000 UTC m=+156.971573311" watchObservedRunningTime="2026-04-22 18:23:27.793551677 +0000 UTC m=+156.971855095" Apr 22 18:23:28.217067 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:28.217035 2580 patch_prober.go:28] interesting pod/image-registry-776f5fddd9-zf8mv container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:23:28.217216 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:28.217088 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" podUID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:23:28.782315 ip-10-0-136-5 kubenswrapper[2580]: 
I0422 18:23:28.782275 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw"
Apr 22 18:23:28.786752 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:28.786729 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kwvtw"
Apr 22 18:23:29.377263 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:29.377224 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qnx5b" podUID="f8090e4a-c63d-4fe0-a159-0e9452335122"
Apr 22 18:23:29.405324 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.405293 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"]
Apr 22 18:23:29.408324 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.408310 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.410876 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.410849 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 18:23:29.411001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.410909 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gdsfp\""
Apr 22 18:23:29.411001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.410922 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:23:29.411001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.410933 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 18:23:29.411001 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.410940 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:23:29.411175 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.410952 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:23:29.417393 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.417377 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"]
Apr 22 18:23:29.473711 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.473685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.473838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.473716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.473838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.473756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vgn7\" (UniqueName: \"kubernetes.io/projected/0cefc720-886d-4a0b-9230-9ec5886be6ed-kube-api-access-8vgn7\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.473838 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.473828 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0cefc720-886d-4a0b-9230-9ec5886be6ed-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.574367 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.574334 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.574525 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.574373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.574525 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.574413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vgn7\" (UniqueName: \"kubernetes.io/projected/0cefc720-886d-4a0b-9230-9ec5886be6ed-kube-api-access-8vgn7\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.574525 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.574469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0cefc720-886d-4a0b-9230-9ec5886be6ed-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.574525 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:29.574492 2580 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 22 18:23:29.574748 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:29.574595 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-tls podName:0cefc720-886d-4a0b-9230-9ec5886be6ed nodeName:}" failed. No retries permitted until 2026-04-22 18:23:30.07455353 +0000 UTC m=+159.252856932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-pqzvh" (UID: "0cefc720-886d-4a0b-9230-9ec5886be6ed") : secret "prometheus-operator-tls" not found
Apr 22 18:23:29.575143 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.575126 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0cefc720-886d-4a0b-9230-9ec5886be6ed-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.576954 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.576937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:29.582981 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:29.582958 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vgn7\" (UniqueName: \"kubernetes.io/projected/0cefc720-886d-4a0b-9230-9ec5886be6ed-kube-api-access-8vgn7\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:30.078688 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:30.078640 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:30.081248 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:30.081223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cefc720-886d-4a0b-9230-9ec5886be6ed-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pqzvh\" (UID: \"0cefc720-886d-4a0b-9230-9ec5886be6ed\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:30.318016 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:30.317979 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"
Apr 22 18:23:30.436417 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:30.436373 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pqzvh"]
Apr 22 18:23:30.438941 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:30.438914 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cefc720_886d_4a0b_9230_9ec5886be6ed.slice/crio-321c119682cc2d1081dca7804dddaa13521cae425a592ac4bb885baeb2330760 WatchSource:0}: Error finding container 321c119682cc2d1081dca7804dddaa13521cae425a592ac4bb885baeb2330760: Status 404 returned error can't find the container with id 321c119682cc2d1081dca7804dddaa13521cae425a592ac4bb885baeb2330760
Apr 22 18:23:30.787715 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:30.787679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh" event={"ID":"0cefc720-886d-4a0b-9230-9ec5886be6ed","Type":"ContainerStarted","Data":"321c119682cc2d1081dca7804dddaa13521cae425a592ac4bb885baeb2330760"}
Apr 22 18:23:31.792230 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:31.792134 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh" event={"ID":"0cefc720-886d-4a0b-9230-9ec5886be6ed","Type":"ContainerStarted","Data":"6a4401cef2e66220b48fdac0f6a3ede0aca7bc310ad9f5b60286054c06893d89"}
Apr 22 18:23:31.792230 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:31.792181 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh" event={"ID":"0cefc720-886d-4a0b-9230-9ec5886be6ed","Type":"ContainerStarted","Data":"3ea412bd0e70f241ed3f032666d595876e9d0f5e0e918efb484205b816c862a4"}
Apr 22 18:23:31.815657 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:31.815599 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-pqzvh" podStartSLOduration=1.753115136 podStartE2EDuration="2.815583724s" podCreationTimestamp="2026-04-22 18:23:29 +0000 UTC" firstStartedPulling="2026-04-22 18:23:30.440901358 +0000 UTC m=+159.619204756" lastFinishedPulling="2026-04-22 18:23:31.503369942 +0000 UTC m=+160.681673344" observedRunningTime="2026-04-22 18:23:31.813529201 +0000 UTC m=+160.991832617" watchObservedRunningTime="2026-04-22 18:23:31.815583724 +0000 UTC m=+160.993887139"
Apr 22 18:23:32.596720 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:32.596678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:23:32.596896 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:32.596761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:23:32.599306 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:32.599271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86e8cf17-2f3c-4df9-abce-314dd430b892-metrics-tls\") pod \"dns-default-h9kgh\" (UID: \"86e8cf17-2f3c-4df9-abce-314dd430b892\") " pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:23:32.599428 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:32.599325 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97893-5155-42ea-878e-ea8aa0dd69d1-cert\") pod \"ingress-canary-wprc6\" (UID: \"b7b97893-5155-42ea-878e-ea8aa0dd69d1\") " pod="openshift-ingress-canary/ingress-canary-wprc6"
Apr 22 18:23:32.882479 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:32.882398 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d4f9\""
Apr 22 18:23:32.891383 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:32.891363 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:23:33.009812 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.009784 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h9kgh"]
Apr 22 18:23:33.012428 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:33.012402 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e8cf17_2f3c_4df9_abce_314dd430b892.slice/crio-e80f9f9e11d651f5b609aa083bdd51483ab04e148d451be2bcf40e3bfaaaa484 WatchSource:0}: Error finding container e80f9f9e11d651f5b609aa083bdd51483ab04e148d451be2bcf40e3bfaaaa484: Status 404 returned error can't find the container with id e80f9f9e11d651f5b609aa083bdd51483ab04e148d451be2bcf40e3bfaaaa484
Apr 22 18:23:33.798684 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.798631 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9kgh" event={"ID":"86e8cf17-2f3c-4df9-abce-314dd430b892","Type":"ContainerStarted","Data":"e80f9f9e11d651f5b609aa083bdd51483ab04e148d451be2bcf40e3bfaaaa484"}
Apr 22 18:23:33.830891 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.830683 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"]
Apr 22 18:23:33.834432 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.834399 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:33.839986 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.839950 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-j4gnj\""
Apr 22 18:23:33.839986 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.839955 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:23:33.840179 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.839955 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 18:23:33.843733 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.843714 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 18:23:33.861184 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.861155 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"]
Apr 22 18:23:33.885674 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.885643 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4kntz"]
Apr 22 18:23:33.889196 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.889173 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:33.893221 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.893165 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:23:33.893221 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.893179 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:23:33.893422 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.893288 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7268c\""
Apr 22 18:23:33.893487 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.893425 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:23:33.909314 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.909280 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:33.909442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.909355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:33.909442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.909402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8897380f-d84c-4c36-8aec-cf476bd0da24-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:33.909442 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.909428 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxp4m\" (UniqueName: \"kubernetes.io/projected/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-api-access-gxp4m\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:33.909617 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.909480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8897380f-d84c-4c36-8aec-cf476bd0da24-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:33.909617 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:33.909518 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.010408 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010368 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-sys\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010640 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-wtmp\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010640 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010640 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8897380f-d84c-4c36-8aec-cf476bd0da24-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.010640 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.010640 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010579 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwct\" (UniqueName: \"kubernetes.io/projected/48bd3eb6-48b1-40d5-9520-0cd198e58edf-kube-api-access-cqwct\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010640 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010607 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48bd3eb6-48b1-40d5-9520-0cd198e58edf-metrics-client-ca\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010943 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010688 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-root\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010943 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-accelerators-collector-config\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.010943 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010819 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.010943 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-tls\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.011135 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010953 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-textfile\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.011135 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.010966 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8897380f-d84c-4c36-8aec-cf476bd0da24-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.011135 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.011005 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.011135 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.011040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8897380f-d84c-4c36-8aec-cf476bd0da24-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.011135 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.011065 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxp4m\" (UniqueName: \"kubernetes.io/projected/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-api-access-gxp4m\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.011426 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.011401 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.011729 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.011710 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8897380f-d84c-4c36-8aec-cf476bd0da24-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.013796 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.013768 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.013921 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.013900 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.028710 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.028680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxp4m\" (UniqueName: \"kubernetes.io/projected/8897380f-d84c-4c36-8aec-cf476bd0da24-kube-api-access-gxp4m\") pod \"kube-state-metrics-69db897b98-jl7b8\" (UID: \"8897380f-d84c-4c36-8aec-cf476bd0da24\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.111917 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.111818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-accelerators-collector-config\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.111917 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.111893 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-tls\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.111923 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-textfile\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.111975 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-sys\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-wtmp\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwct\" (UniqueName: \"kubernetes.io/projected/48bd3eb6-48b1-40d5-9520-0cd198e58edf-kube-api-access-cqwct\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48bd3eb6-48b1-40d5-9520-0cd198e58edf-metrics-client-ca\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-root\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112517 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-root\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112517 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112373 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-wtmp\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112517 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112480 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-accelerators-collector-config\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112517 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112100 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48bd3eb6-48b1-40d5-9520-0cd198e58edf-sys\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.112902 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.112876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48bd3eb6-48b1-40d5-9520-0cd198e58edf-metrics-client-ca\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.113323 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.113300 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-textfile\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.115304 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.115283 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.115474 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.115457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48bd3eb6-48b1-40d5-9520-0cd198e58edf-node-exporter-tls\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.120499 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.120475 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwct\" (UniqueName: \"kubernetes.io/projected/48bd3eb6-48b1-40d5-9520-0cd198e58edf-kube-api-access-cqwct\") pod \"node-exporter-4kntz\" (UID: \"48bd3eb6-48b1-40d5-9520-0cd198e58edf\") " pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.146461 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.146425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"
Apr 22 18:23:34.198500 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.198466 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4kntz"
Apr 22 18:23:34.265054 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:34.265012 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48bd3eb6_48b1_40d5_9520_0cd198e58edf.slice/crio-c3ce7c3f525af6c5bd0dd57e703a582f60bdb2b4290ce38ef76c8d7950432174 WatchSource:0}: Error finding container c3ce7c3f525af6c5bd0dd57e703a582f60bdb2b4290ce38ef76c8d7950432174: Status 404 returned error can't find the container with id c3ce7c3f525af6c5bd0dd57e703a582f60bdb2b4290ce38ef76c8d7950432174
Apr 22 18:23:34.387018 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.386979 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jl7b8"]
Apr 22 18:23:34.398540 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:34.398507 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8897380f_d84c_4c36_8aec_cf476bd0da24.slice/crio-dae569d4f6cc1c4bb5a3509ea330339eeba189dd991a0c1ad07680cabf888b57 WatchSource:0}: Error finding container dae569d4f6cc1c4bb5a3509ea330339eeba189dd991a0c1ad07680cabf888b57: Status 404 returned error can't find the container with id dae569d4f6cc1c4bb5a3509ea330339eeba189dd991a0c1ad07680cabf888b57
Apr 22 18:23:34.809754 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.809694 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4kntz" event={"ID":"48bd3eb6-48b1-40d5-9520-0cd198e58edf","Type":"ContainerStarted","Data":"c3ce7c3f525af6c5bd0dd57e703a582f60bdb2b4290ce38ef76c8d7950432174"}
Apr 22 18:23:34.811071 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.811041 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8" event={"ID":"8897380f-d84c-4c36-8aec-cf476bd0da24","Type":"ContainerStarted","Data":"dae569d4f6cc1c4bb5a3509ea330339eeba189dd991a0c1ad07680cabf888b57"}
Apr 22 18:23:34.813185 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.813124 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9kgh" event={"ID":"86e8cf17-2f3c-4df9-abce-314dd430b892","Type":"ContainerStarted","Data":"9c5af37ac9cf61e288c1a02dd9b88cee2f78c2361c0d36651e6dd6f6c2bcef12"}
Apr 22 18:23:34.813185 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.813156 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9kgh" event={"ID":"86e8cf17-2f3c-4df9-abce-314dd430b892","Type":"ContainerStarted","Data":"fac2e5ce75783404d8bfb9705965d5731d652d175f81bd5a12d005587eea469b"}
Apr 22 18:23:34.813389 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:34.813336 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h9kgh"
Apr 22 18:23:34.837138 ip-10-0-136-5 kubenswrapper[2580]:
I0422 18:23:34.837088 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h9kgh" podStartSLOduration=129.546340484 podStartE2EDuration="2m10.837074829s" podCreationTimestamp="2026-04-22 18:21:24 +0000 UTC" firstStartedPulling="2026-04-22 18:23:33.014168616 +0000 UTC m=+162.192472013" lastFinishedPulling="2026-04-22 18:23:34.304902945 +0000 UTC m=+163.483206358" observedRunningTime="2026-04-22 18:23:34.834409814 +0000 UTC m=+164.012713232" watchObservedRunningTime="2026-04-22 18:23:34.837074829 +0000 UTC m=+164.015378248" Apr 22 18:23:35.818806 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:35.818700 2580 generic.go:358] "Generic (PLEG): container finished" podID="48bd3eb6-48b1-40d5-9520-0cd198e58edf" containerID="39dcdeecf5dea2fd4d972b94a7a30f18e3daf7b3edaed11fa4d66f513aa11b0e" exitCode=0 Apr 22 18:23:35.819289 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:35.818786 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4kntz" event={"ID":"48bd3eb6-48b1-40d5-9520-0cd198e58edf","Type":"ContainerDied","Data":"39dcdeecf5dea2fd4d972b94a7a30f18e3daf7b3edaed11fa4d66f513aa11b0e"} Apr 22 18:23:35.821606 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:35.821460 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8" event={"ID":"8897380f-d84c-4c36-8aec-cf476bd0da24","Type":"ContainerStarted","Data":"5f878f4fedf42db0f04c5288bff677b53da2d95e097cdf2f0343c89b244ed69b"} Apr 22 18:23:35.821606 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:35.821495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8" event={"ID":"8897380f-d84c-4c36-8aec-cf476bd0da24","Type":"ContainerStarted","Data":"a638cc1d0228ec938132eeab8ba8bfb4f2c25fcae1d9b4b68cfe88c9e2f85621"} Apr 22 18:23:35.821606 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:35.821508 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8" event={"ID":"8897380f-d84c-4c36-8aec-cf476bd0da24","Type":"ContainerStarted","Data":"be9b0b9b520a3214a74922872a43979442d522db6ffc86ed569892aeae781ef9"} Apr 22 18:23:35.859774 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:35.859703 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jl7b8" podStartSLOduration=1.778396928 podStartE2EDuration="2.859683809s" podCreationTimestamp="2026-04-22 18:23:33 +0000 UTC" firstStartedPulling="2026-04-22 18:23:34.400555113 +0000 UTC m=+163.578858510" lastFinishedPulling="2026-04-22 18:23:35.481841986 +0000 UTC m=+164.660145391" observedRunningTime="2026-04-22 18:23:35.859643415 +0000 UTC m=+165.037946834" watchObservedRunningTime="2026-04-22 18:23:35.859683809 +0000 UTC m=+165.037987210" Apr 22 18:23:36.825826 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:36.825787 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4kntz" event={"ID":"48bd3eb6-48b1-40d5-9520-0cd198e58edf","Type":"ContainerStarted","Data":"540eb12facde3047c1dc50498ca44b26e889c368b02eacbe86ab6caf1cdf25a2"} Apr 22 18:23:36.825826 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:36.825828 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4kntz" event={"ID":"48bd3eb6-48b1-40d5-9520-0cd198e58edf","Type":"ContainerStarted","Data":"0ab2fc32a94708808dedfa1afb77a08a2f61fbaccac377b3d970085dd38883d4"} Apr 22 18:23:38.217261 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:38.217234 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:38.238338 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:38.238277 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4kntz" 
podStartSLOduration=4.503388796 podStartE2EDuration="5.238238294s" podCreationTimestamp="2026-04-22 18:23:33 +0000 UTC" firstStartedPulling="2026-04-22 18:23:34.266653694 +0000 UTC m=+163.444957091" lastFinishedPulling="2026-04-22 18:23:35.001503178 +0000 UTC m=+164.179806589" observedRunningTime="2026-04-22 18:23:36.852400563 +0000 UTC m=+166.030703982" watchObservedRunningTime="2026-04-22 18:23:38.238238294 +0000 UTC m=+167.416541721" Apr 22 18:23:38.570450 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:38.570375 2580 patch_prober.go:28] interesting pod/image-registry-789c4c4587-j277h container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:23:38.570450 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:38.570426 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-789c4c4587-j277h" podUID="b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:23:39.356075 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:39.356037 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 22 18:23:39.358520 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:39.358499 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pkrg\"" Apr 22 18:23:39.367309 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:39.367288 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wprc6" Apr 22 18:23:39.483763 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:39.483732 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wprc6"] Apr 22 18:23:39.486999 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:23:39.486975 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b97893_5155_42ea_878e_ea8aa0dd69d1.slice/crio-8e1647f45d15920095761dcecab07d239c07b1e68851e88c13d0b87510664791 WatchSource:0}: Error finding container 8e1647f45d15920095761dcecab07d239c07b1e68851e88c13d0b87510664791: Status 404 returned error can't find the container with id 8e1647f45d15920095761dcecab07d239c07b1e68851e88c13d0b87510664791 Apr 22 18:23:39.839885 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:39.839846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wprc6" event={"ID":"b7b97893-5155-42ea-878e-ea8aa0dd69d1","Type":"ContainerStarted","Data":"8e1647f45d15920095761dcecab07d239c07b1e68851e88c13d0b87510664791"} Apr 22 18:23:40.763900 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:40.763874 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-789c4c4587-j277h" Apr 22 18:23:41.359805 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:41.359724 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:23:41.846871 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:41.846837 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wprc6" event={"ID":"b7b97893-5155-42ea-878e-ea8aa0dd69d1","Type":"ContainerStarted","Data":"07e9a463177d264bb3c3d6fae5f71564b692b1d76f967853166b876e85fa9c9c"} Apr 22 18:23:41.863151 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:41.863099 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wprc6" podStartSLOduration=136.263889769 podStartE2EDuration="2m17.863086146s" podCreationTimestamp="2026-04-22 18:21:24 +0000 UTC" firstStartedPulling="2026-04-22 18:23:39.488925959 +0000 UTC m=+168.667229359" lastFinishedPulling="2026-04-22 18:23:41.088122338 +0000 UTC m=+170.266425736" observedRunningTime="2026-04-22 18:23:41.862006034 +0000 UTC m=+171.040309464" watchObservedRunningTime="2026-04-22 18:23:41.863086146 +0000 UTC m=+171.041389564" Apr 22 18:23:43.231065 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.231022 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" podUID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" containerName="registry" containerID="cri-o://37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f" gracePeriod=30 Apr 22 18:23:43.479143 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.479121 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:43.588531 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588445 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-ca-trust-extracted\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588531 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588492 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-bound-sa-token\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588547 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-installation-pull-secrets\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588622 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-image-registry-private-configuration\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588655 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-trusted-ca\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: 
\"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588680 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2th2\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-kube-api-access-n2th2\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588704 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.588775 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.588732 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-certificates\") pod \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\" (UID: \"0a1afc20-9c8c-4a8d-88a3-7199108df2f3\") " Apr 22 18:23:43.589258 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.589199 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:23:43.589696 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.589654 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:23:43.591140 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.591111 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-kube-api-access-n2th2" (OuterVolumeSpecName: "kube-api-access-n2th2") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "kube-api-access-n2th2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:23:43.591469 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.591439 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:23:43.591717 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.591695 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:23:43.591803 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.591749 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:23:43.591906 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.591888 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:23:43.596399 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.596376 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0a1afc20-9c8c-4a8d-88a3-7199108df2f3" (UID: "0a1afc20-9c8c-4a8d-88a3-7199108df2f3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:23:43.690061 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690028 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-installation-pull-secrets\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.690061 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690060 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-image-registry-private-configuration\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.690253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690075 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-trusted-ca\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.690253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690088 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2th2\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-kube-api-access-n2th2\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.690253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690100 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-tls\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.690253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690131 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-registry-certificates\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 
18:23:43.690253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690144 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-ca-trust-extracted\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.690253 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.690157 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a1afc20-9c8c-4a8d-88a3-7199108df2f3-bound-sa-token\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:23:43.852924 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.852836 2580 generic.go:358] "Generic (PLEG): container finished" podID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" containerID="37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f" exitCode=0 Apr 22 18:23:43.852924 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.852892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" event={"ID":"0a1afc20-9c8c-4a8d-88a3-7199108df2f3","Type":"ContainerDied","Data":"37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f"} Apr 22 18:23:43.852924 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.852912 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" Apr 22 18:23:43.853178 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.852931 2580 scope.go:117] "RemoveContainer" containerID="37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f" Apr 22 18:23:43.853178 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.852920 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776f5fddd9-zf8mv" event={"ID":"0a1afc20-9c8c-4a8d-88a3-7199108df2f3","Type":"ContainerDied","Data":"33c9a18ebb086ff1dcd3b62fddb41aadf789e2cd4e83d62a88f98e8ebe05701f"} Apr 22 18:23:43.861576 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.861537 2580 scope.go:117] "RemoveContainer" containerID="37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f" Apr 22 18:23:43.861840 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:23:43.861818 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f\": container with ID starting with 37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f not found: ID does not exist" containerID="37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f" Apr 22 18:23:43.861921 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.861852 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f"} err="failed to get container status \"37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f\": rpc error: code = NotFound desc = could not find container \"37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f\": container with ID starting with 37f3c6d1ad6a501411a090ab432f9cff7b5b04d86cc5a80e769a3e634afdf82f not found: ID does not exist" Apr 22 18:23:43.873818 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.873792 
2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-776f5fddd9-zf8mv"] Apr 22 18:23:43.879505 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:43.879484 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-776f5fddd9-zf8mv"] Apr 22 18:23:44.824424 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:44.824395 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h9kgh" Apr 22 18:23:45.359888 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:23:45.359853 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" path="/var/lib/kubelet/pods/0a1afc20-9c8c-4a8d-88a3-7199108df2f3/volumes" Apr 22 18:24:03.905605 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:03.905550 2580 generic.go:358] "Generic (PLEG): container finished" podID="37cf0ec8-5f5f-44a5-9915-565f95226bc6" containerID="be85fd7773cee652b4ab665cdb05b12c9440ee3f70fad78d61452108d3ceddbd" exitCode=0 Apr 22 18:24:03.906021 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:03.905629 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" event={"ID":"37cf0ec8-5f5f-44a5-9915-565f95226bc6","Type":"ContainerDied","Data":"be85fd7773cee652b4ab665cdb05b12c9440ee3f70fad78d61452108d3ceddbd"} Apr 22 18:24:03.906021 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:03.905927 2580 scope.go:117] "RemoveContainer" containerID="be85fd7773cee652b4ab665cdb05b12c9440ee3f70fad78d61452108d3ceddbd" Apr 22 18:24:04.910064 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:04.910027 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-p6k9n" event={"ID":"37cf0ec8-5f5f-44a5-9915-565f95226bc6","Type":"ContainerStarted","Data":"1913eacad410eb21b5008d52e36d4189ba03b967b5d83aad9c661a2b6567abfd"} Apr 22 18:24:23.960996 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:24:23.960963 2580 generic.go:358] "Generic (PLEG): container finished" podID="40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e" containerID="fcd79d9da1dd8c958f0216f399a5c312028b25fba0816589697efc86dd2e1a54" exitCode=0 Apr 22 18:24:23.961406 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:23.961016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrccp" event={"ID":"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e","Type":"ContainerDied","Data":"fcd79d9da1dd8c958f0216f399a5c312028b25fba0816589697efc86dd2e1a54"} Apr 22 18:24:23.961406 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:23.961299 2580 scope.go:117] "RemoveContainer" containerID="fcd79d9da1dd8c958f0216f399a5c312028b25fba0816589697efc86dd2e1a54" Apr 22 18:24:24.976068 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:24:24.976025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrccp" event={"ID":"40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e","Type":"ContainerStarted","Data":"bd452a8e945ece497d12271a69fad1943a2675c6ae119470f2d46e60234e9ae2"} Apr 22 18:25:03.100336 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:03.100251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:25:03.102808 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:03.102783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8090e4a-c63d-4fe0-a159-0e9452335122-metrics-certs\") pod \"network-metrics-daemon-qnx5b\" (UID: \"f8090e4a-c63d-4fe0-a159-0e9452335122\") " pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:25:03.262248 ip-10-0-136-5 kubenswrapper[2580]: I0422 
18:25:03.262217 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fll8t\"" Apr 22 18:25:03.270862 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:03.270840 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qnx5b" Apr 22 18:25:03.388627 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:03.388535 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qnx5b"] Apr 22 18:25:03.391892 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:25:03.391861 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8090e4a_c63d_4fe0_a159_0e9452335122.slice/crio-3ea188850cfa31a6b8d7370842107e8ceee8383d18192517591bf6f290ed403e WatchSource:0}: Error finding container 3ea188850cfa31a6b8d7370842107e8ceee8383d18192517591bf6f290ed403e: Status 404 returned error can't find the container with id 3ea188850cfa31a6b8d7370842107e8ceee8383d18192517591bf6f290ed403e Apr 22 18:25:04.089432 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:04.089400 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qnx5b" event={"ID":"f8090e4a-c63d-4fe0-a159-0e9452335122","Type":"ContainerStarted","Data":"3ea188850cfa31a6b8d7370842107e8ceee8383d18192517591bf6f290ed403e"} Apr 22 18:25:05.093557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:05.093516 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qnx5b" event={"ID":"f8090e4a-c63d-4fe0-a159-0e9452335122","Type":"ContainerStarted","Data":"cd9d91685c412316bf7b0be6416d33c4f8ea57bc5b5c5a2954822caf7db21eee"} Apr 22 18:25:05.093557 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:05.093548 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qnx5b" 
event={"ID":"f8090e4a-c63d-4fe0-a159-0e9452335122","Type":"ContainerStarted","Data":"c9112ac2ddac4c440c2d4d6e6d5d8748f5ced34679e4e3ec34476605c4748f6e"} Apr 22 18:25:05.113115 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:05.113063 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qnx5b" podStartSLOduration=253.17146017 podStartE2EDuration="4m14.113048508s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:25:03.393540497 +0000 UTC m=+252.571843894" lastFinishedPulling="2026-04-22 18:25:04.335128835 +0000 UTC m=+253.513432232" observedRunningTime="2026-04-22 18:25:05.110940269 +0000 UTC m=+254.289243687" watchObservedRunningTime="2026-04-22 18:25:05.113048508 +0000 UTC m=+254.291351927" Apr 22 18:25:26.501136 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.501097 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7fcpg"] Apr 22 18:25:26.501657 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.501388 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" containerName="registry" Apr 22 18:25:26.501657 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.501400 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" containerName="registry" Apr 22 18:25:26.501657 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.501459 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a1afc20-9c8c-4a8d-88a3-7199108df2f3" containerName="registry" Apr 22 18:25:26.504551 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.504536 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.507031 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.507010 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:25:26.511384 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.511359 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7fcpg"] Apr 22 18:25:26.574721 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.574681 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7825580d-8f1e-477d-97b2-2c6aeef044f1-original-pull-secret\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.574883 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.574740 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7825580d-8f1e-477d-97b2-2c6aeef044f1-dbus\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.574883 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.574803 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7825580d-8f1e-477d-97b2-2c6aeef044f1-kubelet-config\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.676093 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.676060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/7825580d-8f1e-477d-97b2-2c6aeef044f1-original-pull-secret\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.676238 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.676110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7825580d-8f1e-477d-97b2-2c6aeef044f1-dbus\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.676238 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.676128 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7825580d-8f1e-477d-97b2-2c6aeef044f1-kubelet-config\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.676238 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.676234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7825580d-8f1e-477d-97b2-2c6aeef044f1-kubelet-config\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.676396 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.676256 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7825580d-8f1e-477d-97b2-2c6aeef044f1-dbus\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.678539 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.678517 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7825580d-8f1e-477d-97b2-2c6aeef044f1-original-pull-secret\") pod \"global-pull-secret-syncer-7fcpg\" (UID: \"7825580d-8f1e-477d-97b2-2c6aeef044f1\") " pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.813870 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.813793 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fcpg" Apr 22 18:25:26.929765 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:26.929722 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7fcpg"] Apr 22 18:25:26.932895 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:25:26.932867 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7825580d_8f1e_477d_97b2_2c6aeef044f1.slice/crio-b1601a33cd7a8a921003dc4fb72ab8ab93064790857ddde305db09105fef9c46 WatchSource:0}: Error finding container b1601a33cd7a8a921003dc4fb72ab8ab93064790857ddde305db09105fef9c46: Status 404 returned error can't find the container with id b1601a33cd7a8a921003dc4fb72ab8ab93064790857ddde305db09105fef9c46 Apr 22 18:25:27.159249 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:27.159219 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7fcpg" event={"ID":"7825580d-8f1e-477d-97b2-2c6aeef044f1","Type":"ContainerStarted","Data":"b1601a33cd7a8a921003dc4fb72ab8ab93064790857ddde305db09105fef9c46"} Apr 22 18:25:31.173465 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:31.173428 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7fcpg" event={"ID":"7825580d-8f1e-477d-97b2-2c6aeef044f1","Type":"ContainerStarted","Data":"87ac958be267e109c25293035e49c75b50b76f3c5adcc0dc1ef781712a2e8f18"} Apr 22 18:25:31.192032 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:31.191990 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7fcpg" podStartSLOduration=1.514512904 podStartE2EDuration="5.191976587s" podCreationTimestamp="2026-04-22 18:25:26 +0000 UTC" firstStartedPulling="2026-04-22 18:25:26.934588346 +0000 UTC m=+276.112891743" lastFinishedPulling="2026-04-22 18:25:30.612052026 +0000 UTC m=+279.790355426" observedRunningTime="2026-04-22 18:25:31.19084808 +0000 UTC m=+280.369151499" watchObservedRunningTime="2026-04-22 18:25:31.191976587 +0000 UTC m=+280.370280002" Apr 22 18:25:40.428508 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.428478 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x6x2m/must-gather-64dph"] Apr 22 18:25:40.432747 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.432728 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.434926 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.434895 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x6x2m\"/\"kube-root-ca.crt\"" Apr 22 18:25:40.435059 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.435039 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x6x2m\"/\"openshift-service-ca.crt\"" Apr 22 18:25:40.435134 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.435121 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x6x2m\"/\"default-dockercfg-dbj9c\"" Apr 22 18:25:40.442301 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.442264 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x6x2m/must-gather-64dph"] Apr 22 18:25:40.584767 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.584738 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-must-gather-output\") pod \"must-gather-64dph\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.584943 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.584798 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc86x\" (UniqueName: \"kubernetes.io/projected/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-kube-api-access-zc86x\") pod \"must-gather-64dph\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.685209 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.685116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc86x\" (UniqueName: \"kubernetes.io/projected/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-kube-api-access-zc86x\") pod \"must-gather-64dph\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.685209 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.685176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-must-gather-output\") pod \"must-gather-64dph\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.685480 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.685462 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-must-gather-output\") pod \"must-gather-64dph\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.693161 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.693130 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc86x\" (UniqueName: \"kubernetes.io/projected/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-kube-api-access-zc86x\") pod \"must-gather-64dph\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.741814 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.741789 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:25:40.859850 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:40.859818 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x6x2m/must-gather-64dph"] Apr 22 18:25:40.862833 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:25:40.862800 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b104fbd_f3db_4ee2_9cc8_b2fe1bc3385d.slice/crio-fcbf370a34be3973591f639aa5ead4f3a1bde0d91d8d3868b7b2b6afddea38b4 WatchSource:0}: Error finding container fcbf370a34be3973591f639aa5ead4f3a1bde0d91d8d3868b7b2b6afddea38b4: Status 404 returned error can't find the container with id fcbf370a34be3973591f639aa5ead4f3a1bde0d91d8d3868b7b2b6afddea38b4 Apr 22 18:25:41.201078 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:41.201037 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6x2m/must-gather-64dph" event={"ID":"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d","Type":"ContainerStarted","Data":"fcbf370a34be3973591f639aa5ead4f3a1bde0d91d8d3868b7b2b6afddea38b4"} Apr 22 18:25:45.219106 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:45.219068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6x2m/must-gather-64dph" event={"ID":"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d","Type":"ContainerStarted","Data":"33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f"} Apr 22 18:25:46.223501 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:25:46.223461 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6x2m/must-gather-64dph" event={"ID":"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d","Type":"ContainerStarted","Data":"d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92"} Apr 22 18:25:46.240639 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:46.240593 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x6x2m/must-gather-64dph" podStartSLOduration=2.027981998 podStartE2EDuration="6.240574617s" podCreationTimestamp="2026-04-22 18:25:40 +0000 UTC" firstStartedPulling="2026-04-22 18:25:40.864645369 +0000 UTC m=+290.042948766" lastFinishedPulling="2026-04-22 18:25:45.077237982 +0000 UTC m=+294.255541385" observedRunningTime="2026-04-22 18:25:46.238908038 +0000 UTC m=+295.417211457" watchObservedRunningTime="2026-04-22 18:25:46.240574617 +0000 UTC m=+295.418878028" Apr 22 18:25:51.249042 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:51.249012 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:25:58.256506 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:58.256471 2580 generic.go:358] "Generic (PLEG): container finished" podID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerID="33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f" exitCode=0 Apr 22 18:25:58.256941 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:58.256550 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6x2m/must-gather-64dph" event={"ID":"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d","Type":"ContainerDied","Data":"33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f"} Apr 22 18:25:58.256941 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:58.256914 2580 scope.go:117] "RemoveContainer" containerID="33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f" Apr 22 18:25:59.048799 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:25:59.048770 2580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-must-gather-x6x2m_must-gather-64dph_7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d/gather/0.log" Apr 22 18:26:02.187425 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:02.187387 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7fcpg_7825580d-8f1e-477d-97b2-2c6aeef044f1/global-pull-secret-syncer/0.log" Apr 22 18:26:02.367524 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:02.367482 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6vsjn_7bcf606c-7bd7-4d2b-841c-952f4528ebad/konnectivity-agent/0.log" Apr 22 18:26:02.471445 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:02.471371 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-5.ec2.internal_c759d37c693dd42a96a635e33d6e4429/haproxy/0.log" Apr 22 18:26:04.388036 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.387977 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x6x2m/must-gather-64dph"] Apr 22 18:26:04.388430 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.388176 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-x6x2m/must-gather-64dph" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="copy" containerID="cri-o://d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92" gracePeriod=2 Apr 22 18:26:04.389866 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.389837 2580 status_manager.go:895] "Failed to get status for pod" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" pod="openshift-must-gather-x6x2m/must-gather-64dph" err="pods \"must-gather-64dph\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-x6x2m\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object" Apr 22 18:26:04.391223 ip-10-0-136-5 
kubenswrapper[2580]: I0422 18:26:04.391201 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x6x2m/must-gather-64dph"] Apr 22 18:26:04.608240 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.608219 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x6x2m_must-gather-64dph_7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d/copy/0.log" Apr 22 18:26:04.608574 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.608542 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:26:04.610153 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.610125 2580 status_manager.go:895] "Failed to get status for pod" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" pod="openshift-must-gather-x6x2m/must-gather-64dph" err="pods \"must-gather-64dph\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-x6x2m\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object" Apr 22 18:26:04.677729 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.677704 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc86x\" (UniqueName: \"kubernetes.io/projected/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-kube-api-access-zc86x\") pod \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " Apr 22 18:26:04.677879 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.677750 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-must-gather-output\") pod \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\" (UID: \"7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d\") " Apr 22 18:26:04.678205 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.678175 2580 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" (UID: "7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:26:04.680119 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.680097 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-kube-api-access-zc86x" (OuterVolumeSpecName: "kube-api-access-zc86x") pod "7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" (UID: "7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d"). InnerVolumeSpecName "kube-api-access-zc86x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:26:04.778486 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.778449 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zc86x\" (UniqueName: \"kubernetes.io/projected/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-kube-api-access-zc86x\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:26:04.778486 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:04.778479 2580 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d-must-gather-output\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 22 18:26:05.277835 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.277807 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x6x2m_must-gather-64dph_7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d/copy/0.log" Apr 22 18:26:05.278150 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.278126 2580 generic.go:358] "Generic (PLEG): container finished" podID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerID="d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92" exitCode=143 Apr 22 
18:26:05.278250 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.278175 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x6x2m/must-gather-64dph" Apr 22 18:26:05.278250 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.278205 2580 scope.go:117] "RemoveContainer" containerID="d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92" Apr 22 18:26:05.279935 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.279904 2580 status_manager.go:895] "Failed to get status for pod" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" pod="openshift-must-gather-x6x2m/must-gather-64dph" err="pods \"must-gather-64dph\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-x6x2m\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object" Apr 22 18:26:05.285926 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.285884 2580 scope.go:117] "RemoveContainer" containerID="33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f" Apr 22 18:26:05.287501 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.287475 2580 status_manager.go:895] "Failed to get status for pod" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" pod="openshift-must-gather-x6x2m/must-gather-64dph" err="pods \"must-gather-64dph\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-x6x2m\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object" Apr 22 18:26:05.297857 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.297839 2580 scope.go:117] "RemoveContainer" containerID="d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92" Apr 22 18:26:05.298117 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:26:05.298100 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92\": container with ID starting with d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92 not found: ID does not exist" containerID="d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92" Apr 22 18:26:05.298162 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.298125 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92"} err="failed to get container status \"d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92\": rpc error: code = NotFound desc = could not find container \"d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92\": container with ID starting with d7e719b718a70455e863668a2ffcf3df951191000a9d143a04bcc0737ee37b92 not found: ID does not exist" Apr 22 18:26:05.298162 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.298144 2580 scope.go:117] "RemoveContainer" containerID="33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f" Apr 22 18:26:05.298392 ip-10-0-136-5 kubenswrapper[2580]: E0422 18:26:05.298370 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f\": container with ID starting with 33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f not found: ID does not exist" containerID="33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f" Apr 22 18:26:05.298448 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.298413 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f"} err="failed to get container status \"33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f\": rpc error: code = NotFound desc = could not find container 
\"33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f\": container with ID starting with 33af5dd34898b6eb971aec02d0eebc724dd21a64bc45943d604ad60956246e1f not found: ID does not exist" Apr 22 18:26:05.360964 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.360933 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" path="/var/lib/kubelet/pods/7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d/volumes" Apr 22 18:26:05.925587 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.925533 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jl7b8_8897380f-d84c-4c36-8aec-cf476bd0da24/kube-state-metrics/0.log" Apr 22 18:26:05.948817 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.948790 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jl7b8_8897380f-d84c-4c36-8aec-cf476bd0da24/kube-rbac-proxy-main/0.log" Apr 22 18:26:05.975542 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:05.975520 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jl7b8_8897380f-d84c-4c36-8aec-cf476bd0da24/kube-rbac-proxy-self/0.log" Apr 22 18:26:06.097042 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:06.097017 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4kntz_48bd3eb6-48b1-40d5-9520-0cd198e58edf/node-exporter/0.log" Apr 22 18:26:06.124296 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:06.124273 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4kntz_48bd3eb6-48b1-40d5-9520-0cd198e58edf/kube-rbac-proxy/0.log" Apr 22 18:26:06.153791 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:06.153767 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4kntz_48bd3eb6-48b1-40d5-9520-0cd198e58edf/init-textfile/0.log" Apr 22 
18:26:06.755765 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:06.755729 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-pqzvh_0cefc720-886d-4a0b-9230-9ec5886be6ed/prometheus-operator/0.log" Apr 22 18:26:06.779501 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:06.779461 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-pqzvh_0cefc720-886d-4a0b-9230-9ec5886be6ed/kube-rbac-proxy/0.log" Apr 22 18:26:06.805543 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:06.805519 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-kwvtw_525ae2d0-e343-4511-a0b1-078612ff16e1/prometheus-operator-admission-webhook/0.log" Apr 22 18:26:08.503160 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503130 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"] Apr 22 18:26:08.503528 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503511 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="gather" Apr 22 18:26:08.503611 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503530 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="gather" Apr 22 18:26:08.503611 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503542 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="copy" Apr 22 18:26:08.503611 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503549 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="copy" Apr 22 18:26:08.503720 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503638 2580 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="copy" Apr 22 18:26:08.503720 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.503653 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b104fbd-f3db-4ee2-9cc8-b2fe1bc3385d" containerName="gather" Apr 22 18:26:08.508438 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.508421 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l" Apr 22 18:26:08.510456 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.510417 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v9l6\"/\"kube-root-ca.crt\"" Apr 22 18:26:08.510585 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.510429 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v9l6\"/\"openshift-service-ca.crt\"" Apr 22 18:26:08.510983 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.510967 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5v9l6\"/\"default-dockercfg-p2l25\"" Apr 22 18:26:08.517077 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.517051 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"] Apr 22 18:26:08.610138 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.610106 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-proc\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l" Apr 22 18:26:08.610138 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.610137 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-podres\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.610344 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.610164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-sys\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.610344 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.610179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-lib-modules\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.610344 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.610263 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27cg5\" (UniqueName: \"kubernetes.io/projected/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-kube-api-access-27cg5\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711179 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711152 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27cg5\" (UniqueName: \"kubernetes.io/projected/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-kube-api-access-27cg5\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711348 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711208 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-proc\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711348 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-podres\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711348 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711277 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-sys\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711348 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-lib-modules\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711510 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711355 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-sys\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711510 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711355 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-proc\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711510 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-podres\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.711510 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.711434 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-lib-modules\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.718915 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.718898 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27cg5\" (UniqueName: \"kubernetes.io/projected/2654a277-55e4-4fab-bc9d-13b0ef1be1f9-kube-api-access-27cg5\") pod \"perf-node-gather-daemonset-8sb6l\" (UID: \"2654a277-55e4-4fab-bc9d-13b0ef1be1f9\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.819103 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.819017 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:08.935910 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.935888 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"]
Apr 22 18:26:08.937902 ip-10-0-136-5 kubenswrapper[2580]: W0422 18:26:08.937877 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2654a277_55e4_4fab_bc9d_13b0ef1be1f9.slice/crio-eb2ef7b0966caedd439d0db4b353d45cc5308da8e266d110b57a3be62f5a4f31 WatchSource:0}: Error finding container eb2ef7b0966caedd439d0db4b353d45cc5308da8e266d110b57a3be62f5a4f31: Status 404 returned error can't find the container with id eb2ef7b0966caedd439d0db4b353d45cc5308da8e266d110b57a3be62f5a4f31
Apr 22 18:26:08.939554 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:08.939537 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:26:09.194316 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.194287 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-4j7n9_0872edf3-f4f1-4b55-a3a3-ab4b349db77e/volume-data-source-validator/0.log"
Apr 22 18:26:09.291170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.291137 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l" event={"ID":"2654a277-55e4-4fab-bc9d-13b0ef1be1f9","Type":"ContainerStarted","Data":"fd83294be180df1ac62c3e81e0ac86a56d77ead61d8f72e5f2566dda5ca57a81"}
Apr 22 18:26:09.291170 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.291172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l" event={"ID":"2654a277-55e4-4fab-bc9d-13b0ef1be1f9","Type":"ContainerStarted","Data":"eb2ef7b0966caedd439d0db4b353d45cc5308da8e266d110b57a3be62f5a4f31"}
Apr 22 18:26:09.291368 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.291266 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:09.307320 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.307258 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l" podStartSLOduration=1.307241988 podStartE2EDuration="1.307241988s" podCreationTimestamp="2026-04-22 18:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:26:09.306153782 +0000 UTC m=+318.484457202" watchObservedRunningTime="2026-04-22 18:26:09.307241988 +0000 UTC m=+318.485545386"
Apr 22 18:26:09.892023 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.891996 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h9kgh_86e8cf17-2f3c-4df9-abce-314dd430b892/dns/0.log"
Apr 22 18:26:09.914678 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.914658 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h9kgh_86e8cf17-2f3c-4df9-abce-314dd430b892/kube-rbac-proxy/0.log"
Apr 22 18:26:09.938772 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:09.938747 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9qx96_df0eb815-7f84-4620-bd4e-5699eeedbf3f/dns-node-resolver/0.log"
Apr 22 18:26:10.367955 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:10.367925 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-789c4c4587-j277h_b497ce63-7e7e-4e1c-bc4c-5e2e58a2ec22/registry/0.log"
Apr 22 18:26:10.438488 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:10.438464 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pvr6x_02345ce1-bba1-425d-a97e-9bacb8d17a7d/node-ca/0.log"
Apr 22 18:26:11.486021 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:11.485991 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wprc6_b7b97893-5155-42ea-878e-ea8aa0dd69d1/serve-healthcheck-canary/0.log"
Apr 22 18:26:11.831010 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:11.830916 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hrccp_40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e/insights-operator/1.log"
Apr 22 18:26:11.831889 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:11.831865 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hrccp_40b6cfcb-98f8-49a8-8df5-3b2ce0142c7e/insights-operator/0.log"
Apr 22 18:26:11.855441 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:11.855414 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-62hkj_bd5b58fe-bffd-45e2-972d-e6aeb36c6a40/kube-rbac-proxy/0.log"
Apr 22 18:26:11.876494 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:11.876458 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-62hkj_bd5b58fe-bffd-45e2-972d-e6aeb36c6a40/exporter/0.log"
Apr 22 18:26:11.899424 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:11.899394 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-62hkj_bd5b58fe-bffd-45e2-972d-e6aeb36c6a40/extractor/0.log"
Apr 22 18:26:15.303661 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:15.303635 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-8sb6l"
Apr 22 18:26:16.038016 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:16.037981 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jcl86_8f2bc053-e57e-498b-b8aa-dc6e579b317d/migrator/0.log"
Apr 22 18:26:16.069644 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:16.069624 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jcl86_8f2bc053-e57e-498b-b8aa-dc6e579b317d/graceful-termination/0.log"
Apr 22 18:26:17.524623 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.524595 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:26:17.549436 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.549401 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/egress-router-binary-copy/0.log"
Apr 22 18:26:17.570376 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.570357 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/cni-plugins/0.log"
Apr 22 18:26:17.594088 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.594068 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/bond-cni-plugin/0.log"
Apr 22 18:26:17.616197 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.616174 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/routeoverride-cni/0.log"
Apr 22 18:26:17.640179 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.640159 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/whereabouts-cni-bincopy/0.log"
Apr 22 18:26:17.661469 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.661442 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-trrn6_218a3414-b342-4f96-b216-476f4ac84b2d/whereabouts-cni/0.log"
Apr 22 18:26:17.718246 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.718226 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lffvw_4f9336b9-d229-4225-bb2f-5e9509a44cdf/kube-multus/0.log"
Apr 22 18:26:17.825677 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.825552 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qnx5b_f8090e4a-c63d-4fe0-a159-0e9452335122/network-metrics-daemon/0.log"
Apr 22 18:26:17.848373 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:17.848347 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qnx5b_f8090e4a-c63d-4fe0-a159-0e9452335122/kube-rbac-proxy/0.log"
Apr 22 18:26:18.722756 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.722729 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/ovn-controller/0.log"
Apr 22 18:26:18.746638 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.746618 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/ovn-acl-logging/0.log"
Apr 22 18:26:18.774746 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.774727 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/kube-rbac-proxy-node/0.log"
Apr 22 18:26:18.801306 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.801286 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:26:18.822681 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.822661 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/northd/0.log"
Apr 22 18:26:18.845174 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.845114 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/nbdb/0.log"
Apr 22 18:26:18.867878 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:18.867853 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/sbdb/0.log"
Apr 22 18:26:19.016529 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:19.016501 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55s7b_7492c87e-50ec-4adc-9d4d-671ecf7e2492/ovnkube-controller/0.log"
Apr 22 18:26:20.737959 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:20.737932 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-nsxxb_542c10ef-6e44-4abe-9574-3653eac5bfae/check-endpoints/0.log"
Apr 22 18:26:20.767345 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:20.767317 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5pf5l_8e4c591e-67eb-4346-8dee-d861e58cc4a0/network-check-target-container/0.log"
Apr 22 18:26:21.732051 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:21.732023 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wxldw_69331e75-5dda-4eb0-bab5-92289e6d91b9/iptables-alerter/0.log"
Apr 22 18:26:22.464395 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:22.464367 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-s9shc_740b9fc5-590e-4e08-9353-98530ff6a51d/tuned/0.log"
Apr 22 18:26:25.107659 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:25.107601 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-p6k9n_37cf0ec8-5f5f-44a5-9915-565f95226bc6/service-ca-operator/1.log"
Apr 22 18:26:25.109419 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:25.109379 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-p6k9n_37cf0ec8-5f5f-44a5-9915-565f95226bc6/service-ca-operator/0.log"
Apr 22 18:26:25.814922 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:25.814891 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-j6xkg_a537ebfe-6f3d-459f-9bed-0c90b065a507/csi-driver/0.log"
Apr 22 18:26:25.838168 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:25.838141 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-j6xkg_a537ebfe-6f3d-459f-9bed-0c90b065a507/csi-node-driver-registrar/0.log"
Apr 22 18:26:25.859055 ip-10-0-136-5 kubenswrapper[2580]: I0422 18:26:25.859027 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-j6xkg_a537ebfe-6f3d-459f-9bed-0c90b065a507/csi-liveness-probe/0.log"