Apr 23 08:47:32.885446 ip-10-0-141-232 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:47:33.405392 ip-10-0-141-232 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:47:33.405392 ip-10-0-141-232 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:47:33.405392 ip-10-0-141-232 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:47:33.405392 ip-10-0-141-232 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:47:33.405392 ip-10-0-141-232 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:47:33.407388 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.407288 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:47:33.409820 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409800 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:47:33.409820 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409820 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409825 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409829 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409833 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409835 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409838 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409841 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409844 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409847 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409849 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409852 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409855 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409858 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409862 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409864 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409867 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409870 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409874 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409877 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409880 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:47:33.409894 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409882 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409885 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409888 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409891 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409894 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409897 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409900 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409903 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409906 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409909 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409912 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409915 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409917 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409920 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409922 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409927 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409931 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409934 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409936 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:47:33.410388 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409939 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409942 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409944 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409947 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409949 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409952 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409954 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409958 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409960 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409962 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409965 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409968 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409970 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409972 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409975 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409979 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409982 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409985 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409988 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409991 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:47:33.410898 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409994 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409997 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.409999 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410002 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410005 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410007 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410011 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410013 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410016 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410020 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410022 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410025 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410027 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410030 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410032 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410034 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410037 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410039 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410056 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:47:33.411419 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410059 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410063 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410067 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410070 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410073 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410075 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410078 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410526 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410532 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410535 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410537 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410540 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410543 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410545 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410548 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410551 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410553 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410556 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410558 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410562 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:47:33.411893 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410564 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410568 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410570 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410573 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410575 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410578 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410580 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410583 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410586 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410589 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410591 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410594 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410598 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410600 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410603 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410605 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410608 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410610 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410613 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410615 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:47:33.412413 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410619 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410623 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410627 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410629 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410632 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410635 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410637 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410640 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410643 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410645 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410648 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410650 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410654 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410656 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410659 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410661 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410664 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410666 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410669 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:47:33.412919 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410671 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410674 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410679 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410682 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410685 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410689 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410692 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410694 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410698 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410701 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410704 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410707 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410709 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410712 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410715 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410717 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410720 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410723 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410725 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:47:33.413409 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410728 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410730 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410733 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410735 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410738 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410740 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410743 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410746 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410748 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410752 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410754 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410757 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410760 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410762 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.410765 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410849 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410860 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410869 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410876 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410883 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410888 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410894 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:47:33.413915 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410899 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410902 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410907 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410911 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410915 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410919 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410922 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410925 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410928 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410931 2579 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410934 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410937 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410944 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410947 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410950 2579 flags.go:64] FLAG: --config-dir=""
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410953 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410957 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410961 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410970 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410973 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410976 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410980 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410983 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410986 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410989 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:47:33.414474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410992 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410996 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.410999 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411002 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411005 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411008 2579 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411012 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411019 2579 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411023 2579 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411026 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411029 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411032 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411036 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411039 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411056 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411060 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411063 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411066 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411069 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411072 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411076 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411079 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411082 2579 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411086 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411090 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:47:33.415116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411093 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411097 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411101 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411104 2579 flags.go:64] FLAG: --help="false"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411107 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-141-232.ec2.internal"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411110 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411114 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411117 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411120 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411124 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411127 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411130 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411132 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411135 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411138 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411142 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411145 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411148 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411151 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411154 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411157 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411160 2579 flags.go:64] FLAG: --lock-file=""
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411163 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411166 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 08:47:33.415754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411169 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411175 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411178 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411181 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411184 2579 flags.go:64] FLAG: --logging-format="text"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411187 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411191 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411194 2579 flags.go:64] FLAG: --manifest-url=""
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411197 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411201 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411204 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411209 2579 flags.go:64] FLAG: --max-pods="110"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411212 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411215 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411218 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411221 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411224 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411227 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411231 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411238 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411242 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411245 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411248 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 23 08:47:33.416400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411251 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411258 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411261 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411264 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411270 2579 flags.go:64] FLAG: --port="10250"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411274 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411277 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0edd1a543ca6829e5"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411281 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411284 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411287 2579 flags.go:64] FLAG: --register-node="true"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411290 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411293 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411297 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411300 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411303 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411306 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411310 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411313 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411316 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411319 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411323 2579 flags.go:64] FLAG: --runonce="false"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411326 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411334 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411337 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411340 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411343 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 08:47:33.416970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411346 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411349 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411353 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411356 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411359 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411361 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411364 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411367 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411370 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411373 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411380 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411383 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411386 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411394 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411397 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411400 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411403 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411406 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411409 2579 flags.go:64] FLAG: --v="2"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411413 2579 flags.go:64] FLAG: --version="false"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411418 2579 flags.go:64] FLAG: --vmodule=""
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411423 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411426 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411547 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:47:33.417872 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411551 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411554 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411557 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411560 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411565 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411567 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411570 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411573 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411575 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411578 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411581 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411583 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411586 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411589 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411592 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411595 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411597 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411600 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411604 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411607 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:47:33.418905 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411609 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411612 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411615 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411617 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411620 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411622 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411651 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411656 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411660 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411663 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411667 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411671 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411674 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411676 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411678 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411681 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411686 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411689 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411692 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:47:33.419841 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411694 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411697 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411699 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411702 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411704 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411707 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411710 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411713 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411715 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411718 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411721 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411725 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411727 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411730 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411732 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411735 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411738 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411740 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411743 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411746 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:47:33.420801 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411748 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411751 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411753 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411756 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411759 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411762 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411764 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411767 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411771 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411775 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411779 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411782 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411785 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411788 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411791 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411794 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411797 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411800 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411803 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411805 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:47:33.421368 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411808 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411811 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411814 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411818 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411820 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.411823 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.411829 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.419500 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.419527 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" 
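The feature-gates map above is assembled from the KubeletConfiguration's featureGates field. Names the kubelet itself does not register (OpenShift operator gates such as RouteAdvertisements or PinnedImages) are warned about as "unrecognized" and skipped, while recognized-but-deprecated or already-GA gates produce the feature_gate.go:349/351 notices before the final map is logged. A minimal, self-contained sketch of that name=bool parsing behavior (illustrative only; the real implementation lives in k8s.io/component-base/featuregate, and the gate names here are taken from this log):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// known mimics a component's registered feature gates; everything
// else counts as "unrecognized", like the warnings above.
var known = map[string]string{
	"KMSv1":                          "deprecated",
	"ServiceAccountTokenNodeBinding": "ga",
	"NodeSwap":                       "beta",
}

// parseGates parses "Name=true,Other=false" the way a feature-gates
// spec is parsed, warning on unknown, deprecated, or GA gates.
func parseGates(spec string) (map[string]bool, error) {
	gates := map[string]bool{}
	for _, kv := range strings.Split(spec, ",") {
		name, val, ok := strings.Cut(kv, "=")
		if !ok {
			return nil, fmt.Errorf("missing '=' in %q", kv)
		}
		enabled, err := strconv.ParseBool(val)
		if err != nil {
			return nil, fmt.Errorf("invalid bool for %s: %v", name, err)
		}
		switch known[name] {
		case "":
			fmt.Printf("W: unrecognized feature gate: %s\n", name)
			continue // skipped, as in the kubelet warnings above
		case "deprecated":
			fmt.Printf("W: Setting deprecated feature gate %s=%v\n", name, enabled)
		case "ga":
			fmt.Printf("W: Setting GA feature gate %s=%v\n", name, enabled)
		}
		gates[name] = enabled
	}
	return gates, nil
}

func main() {
	g, err := parseGates("KMSv1=true,RouteAdvertisements=true,NodeSwap=false")
	if err != nil {
		panic(err)
	}
	fmt.Println("feature gates:", g)
}
```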
GOTRACEBACK="" Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419617 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419625 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419631 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419636 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419640 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419645 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419650 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:47:33.421972 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419654 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419659 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419664 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419669 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419673 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419677 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419682 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419686 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419690 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419695 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419700 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419704 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419708 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419712 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419717 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419721 2579 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419725 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419729 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419734 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419738 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:47:33.422557 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419743 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419748 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419753 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419760 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419767 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419772 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419777 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419781 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419786 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419790 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419795 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419799 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419804 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419808 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419812 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419817 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419825 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419832 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419836 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419841 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:47:33.423344 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419845 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419851 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419857 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419862 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419866 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419870 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419875 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419879 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419883 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419888 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419892 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419897 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419901 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419906 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419911 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419915 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419920 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419925 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419930 2579 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 23 08:47:33.423957 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419934 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419939 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419943 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419948 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419952 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419956 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419960 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419977 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419982 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419986 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419990 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419995 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.419999 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420003 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420007 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420011 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420015 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420019 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420024 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:47:33.424475 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420028 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.420037 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420245 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420255 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420261 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420266 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420271 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420276 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420281 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420287 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420292 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420297 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420302 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420306 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420311 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420315 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:47:33.424940 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420319 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420323 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420327 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420331 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420336 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420340 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420344 2579 feature_gate.go:328] unrecognized feature 
gate: ConsolePluginContentSecurityPolicy Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420348 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420353 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420356 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420361 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420365 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420370 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420374 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420378 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420383 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420387 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420391 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420395 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420399 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:47:33.425404 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420403 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420407 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420412 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420416 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420421 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420425 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420429 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420434 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420439 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:47:33.425884 ip-10-0-141-232 
kubenswrapper[2579]: W0423 08:47:33.420444 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420448 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420453 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420457 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420461 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420466 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420470 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420474 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420478 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420483 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:47:33.425884 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420487 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420491 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420495 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420499 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420503 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420507 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420511 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420515 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420519 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420524 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420528 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420532 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 
08:47:33.420538 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420544 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420548 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420553 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420558 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420563 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420567 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:47:33.426482 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420572 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420577 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420584 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420589 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420595 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420600 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420606 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420610 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420615 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420619 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420623 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420628 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420632 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:33.420636 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.420645 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
Apr 23 08:47:33.426959 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.421462 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:47:33.427921 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.427900 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:47:33.429116 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.429102 2579 server.go:1019] "Starting client certificate rotation"
Apr 23 08:47:33.429222 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.429204 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:47:33.429261 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.429252 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:47:33.458250 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.458224 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:47:33.460317 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.460294 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:47:33.474151 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.474126 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:47:33.482310 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.482288 2579 log.go:25] "Validated CRI v1 image API"
Apr 23 08:47:33.483837 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.483822 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:47:33.487064 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.487028 2579 fs.go:135] Filesystem UUIDs: map[4c14f503-d9d6-4261-89fc-6a69b80ccc71:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b442520e-8e37-4416-ae74-df4756ee4bae:/dev/nvme0n1p4]
Apr 23 08:47:33.487137 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.487065 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:47:33.493172 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.493059 2579 manager.go:217] Machine: {Timestamp:2026-04-23 08:47:33.490973333 +0000 UTC m=+0.470510140 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103965 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c27fe63903a5e96e68eb405950355 SystemUUID:ec2c27fe-6390-3a5e-96e6-8eb405950355 BootID:8559b63d-4dbf-4df8-bb88-177c7ae69e06 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:42:38:2f:7e:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:42:38:2f:7e:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:03:69:03:49:c8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:47:33.493765 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.493754 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
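The fs.go and manager.go:217 entries above are cAdvisor's one-time filesystem and machine inventory. The capacity and inode figures come from statfs(2) against each mountpoint; the following is a Linux-only sketch that reproduces those numbers for the mountpoints listed in the partitions map (illustrative, not cAdvisor's actual code):

```go
package main

// Linux-only: derive the same capacity/inode numbers cAdvisor logs
// above (fs.go) straight from statfs(2). The mountpoints are the ones
// listed in the "Filesystem partitions" entry.
import (
	"fmt"
	"syscall"
)

func main() {
	for _, mnt := range []string{"/boot", "/var", "/run", "/tmp"} {
		var st syscall.Statfs_t
		if err := syscall.Statfs(mnt, &st); err != nil {
			fmt.Printf("%s: %v\n", mnt, err)
			continue
		}
		// e.g. 128243970048 bytes for /var in the inventory above
		capacity := st.Blocks * uint64(st.Bsize)
		fmt.Printf("%-6s capacity=%d bytes inodes=%d\n", mnt, capacity, st.Files)
	}
}
```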
Apr 23 08:47:33.493902 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.493890 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 08:47:33.496590 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.496546 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 08:47:33.496967 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.496610 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-232.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 08:47:33.497019 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.496979 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 08:47:33.497019 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.496990 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 08:47:33.497019 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.497005 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 08:47:33.498338 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.498326 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 08:47:33.500393 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.500383 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:47:33.500707 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.500692 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 08:47:33.504242 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.504229 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 23 08:47:33.504311 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.504245 2579 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 23 08:47:33.504311 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.504258 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 08:47:33.504311 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.504270 2579 kubelet.go:397] "Adding apiserver pod source" Apr 23 08:47:33.504311 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.504278 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 08:47:33.505440 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.505426 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:47:33.505489 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.505445 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:47:33.507535 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.507513 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:47:33.509656 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.509639 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 08:47:33.511746 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.511729 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:47:33.513225 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513210 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513231 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513239 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513245 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513252 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513258 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513266 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513274 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513285 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:47:33.513289 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513291 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:47:33.513524 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513306 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 08:47:33.513524 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.513316 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 
Apr 23 08:47:33.514367 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.514352 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 08:47:33.514367 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.514364 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 08:47:33.518451 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.518436 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 08:47:33.518563 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.518479 2579 server.go:1295] "Started kubelet"
Apr 23 08:47:33.518563 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.518544 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 08:47:33.518660 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.518598 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 08:47:33.518709 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.518675 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 08:47:33.519331 ip-10-0-141-232 systemd[1]: Started Kubernetes Kubelet.
Apr 23 08:47:33.520858 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.520721 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 08:47:33.523745 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.523727 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 08:47:33.524101 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.524077 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 08:47:33.524217 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.524188 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 08:47:33.525058 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.524152 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-232.ec2.internal.18a8f01b69431ada default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-232.ec2.internal,UID:ip-10-0-141-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-232.ec2.internal,},FirstTimestamp:2026-04-23 08:47:33.51844937 +0000 UTC m=+0.497986177,LastTimestamp:2026-04-23 08:47:33.51844937 +0000 UTC m=+0.497986177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-232.ec2.internal,}"
Apr 23 08:47:33.530091 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.530065 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 08:47:33.530222 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.530194 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-232.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 08:47:33.530336 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.530282 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 08:47:33.530647 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.530634 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 08:47:33.531383 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531361 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 08:47:33.531447 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531361 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 08:47:33.531447 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531397 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 08:47:33.531447 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531414 2579 factory.go:55] Registering systemd factory
Apr 23 08:47:33.531570 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531465 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 23 08:47:33.531570 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531491 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 08:47:33.531570 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531497 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 08:47:33.531671 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.531613 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found"
Apr 23 08:47:33.531707 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531691 2579 factory.go:153] Registering CRI-O factory
Apr 23 08:47:33.531707 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531701 2579 factory.go:223] Registration of the crio container factory successfully
Apr 23 08:47:33.531807 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531796 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 08:47:33.531841 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531815 2579 factory.go:103] Registering Raw factory
Apr 23 08:47:33.531841 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.531825 2579 manager.go:1196] Started watching for new ooms in manager
Apr 23 08:47:33.532245 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.532228 2579 manager.go:319] Starting recovery of all containers
Apr 23 08:47:33.537661 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.537582 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 08:47:33.537810 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.537752 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
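Every "system:anonymous ... is forbidden" failure in this window is expected during TLS bootstrap: the kubelet is still on its bootstrap credentials, and the client certificate it requested (CSR csr-d2pws, approved and issued a few entries below) is not yet on disk, so API calls fall back to anonymous auth and are rejected until the new kubeconfig is written. A hedged client-go sketch for inspecting such a CSR's state out-of-band (the kubeconfig path here is an assumption for the sketch; the CSR name is taken from this log):

```go
package main

import (
	"context"
	"fmt"

	certificatesv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load an admin kubeconfig (path is an assumption, not from the log).
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	csr, err := cs.CertificatesV1().CertificateSigningRequests().
		Get(context.TODO(), "csr-d2pws", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, c := range csr.Status.Conditions {
		if c.Type == certificatesv1.CertificateApproved {
			fmt.Println("CSR approved:", c.Reason)
		}
	}
	// A non-empty .status.certificate corresponds to the
	// "approved, waiting to be issued" -> "issued" transition below.
	fmt.Printf("certificate issued: %v\n", len(csr.Status.Certificate) > 0)
}
```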
Apr 23 08:47:33.543930 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.543911 2579 manager.go:324] Recovery completed
Apr 23 08:47:33.545314 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.545295 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 23 08:47:33.548766 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.548754 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:47:33.552257 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.552231 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:47:33.552325 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.552270 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:47:33.552325 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.552283 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:47:33.552735 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.552720 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 08:47:33.552735 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.552733 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 08:47:33.552871 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.552752 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:47:33.555375 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.555307 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-232.ec2.internal.18a8f01b6b46f49e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-232.ec2.internal,UID:ip-10-0-141-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-232.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-232.ec2.internal,},FirstTimestamp:2026-04-23 08:47:33.552256158 +0000 UTC m=+0.531792966,LastTimestamp:2026-04-23 08:47:33.552256158 +0000 UTC m=+0.531792966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-232.ec2.internal,}"
Apr 23 08:47:33.557158 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.557143 2579 policy_none.go:49] "None policy: Start"
Apr 23 08:47:33.557244 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.557162 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 08:47:33.557244 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.557176 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 08:47:33.572473 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.572395 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-232.ec2.internal.18a8f01b6b474847 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-232.ec2.internal,UID:ip-10-0-141-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-141-232.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-141-232.ec2.internal,},FirstTimestamp:2026-04-23 08:47:33.552277575 +0000 UTC m=+0.531814382,LastTimestamp:2026-04-23 08:47:33.552277575 +0000 UTC m=+0.531814382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-232.ec2.internal,}"
Apr 23 08:47:33.584089 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.583990 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-232.ec2.internal.18a8f01b6b477006 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-232.ec2.internal,UID:ip-10-0-141-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-141-232.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-141-232.ec2.internal,},FirstTimestamp:2026-04-23 08:47:33.55228775 +0000 UTC m=+0.531824556,LastTimestamp:2026-04-23 08:47:33.55228775 +0000 UTC m=+0.531824556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-232.ec2.internal,}"
Apr 23 08:47:33.592714 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.592693 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d2pws"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.598646 2579 manager.go:341] "Starting Device Plugin manager"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.598682 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.598691 2579 server.go:85] "Starting device plugin registration server"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.598991 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.599007 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.599116 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.599260 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.599272 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.599792 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.599828 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-232.ec2.internal\" not found"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.603900 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.604251 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d2pws"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.605116 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.605140 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.605157 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.605166 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:47:33.613620 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.605196 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:47:33.617513 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.617495 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:47:33.700105 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.699999 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:47:33.701031 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.701010 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:47:33.701114 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.701055 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:47:33.701114 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.701072 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:47:33.701114 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.701097 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-232.ec2.internal"
Apr 23 08:47:33.706161 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.706142 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal"]
Apr 23 08:47:33.706221 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.706212 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:47:33.707101 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.707085 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 23
08:47:33.707183 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.707113 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:47:33.707183 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.707133 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:47:33.709190 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709176 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:47:33.709310 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709294 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.709347 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709328 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:47:33.709914 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709897 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:47:33.709976 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709918 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:47:33.709976 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709941 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:47:33.709976 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709964 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:47:33.710085 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.709923 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:47:33.710085 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.710013 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:47:33.710930 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.710916 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.710980 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.710935 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-232.ec2.internal\": node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:33.712065 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.712039 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.712131 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.712077 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:47:33.712746 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.712726 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:47:33.712820 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.712754 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:47:33.712820 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.712764 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:47:33.722024 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.721999 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:33.745073 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.745030 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-232.ec2.internal\" not found" node="ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.749682 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.749665 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-232.ec2.internal\" not found" node="ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.822865 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.822828 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:33.832307 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.832277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/99512a33fd32ce0e80258f2ae365c428-config\") pod \"kube-apiserver-proxy-ip-10-0-141-232.ec2.internal\" (UID: \"99512a33fd32ce0e80258f2ae365c428\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.832307 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.832305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c4be9192100d7e0e31845a4171d4170-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal\" (UID: \"9c4be9192100d7e0e31845a4171d4170\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.832481 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.832326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c4be9192100d7e0e31845a4171d4170-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal\" (UID: \"9c4be9192100d7e0e31845a4171d4170\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.923736 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:33.923702 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:33.933010 
ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.932985 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c4be9192100d7e0e31845a4171d4170-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal\" (UID: \"9c4be9192100d7e0e31845a4171d4170\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.933094 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.933017 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c4be9192100d7e0e31845a4171d4170-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal\" (UID: \"9c4be9192100d7e0e31845a4171d4170\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.933094 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.933036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/99512a33fd32ce0e80258f2ae365c428-config\") pod \"kube-apiserver-proxy-ip-10-0-141-232.ec2.internal\" (UID: \"99512a33fd32ce0e80258f2ae365c428\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.933094 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.933073 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/99512a33fd32ce0e80258f2ae365c428-config\") pod \"kube-apiserver-proxy-ip-10-0-141-232.ec2.internal\" (UID: \"99512a33fd32ce0e80258f2ae365c428\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.933196 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.933089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c4be9192100d7e0e31845a4171d4170-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal\" (UID: \"9c4be9192100d7e0e31845a4171d4170\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:33.933196 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:33.933113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c4be9192100d7e0e31845a4171d4170-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal\" (UID: \"9c4be9192100d7e0e31845a4171d4170\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:34.024457 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.024425 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.047970 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.047924 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:34.052790 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.052772 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" Apr 23 08:47:34.125022 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.124963 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.225580 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.225544 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.326147 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.326060 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.426665 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.426633 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.428799 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.428780 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 08:47:34.428964 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.428947 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 08:47:34.527708 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.527672 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.530881 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.530857 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 08:47:34.555166 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.555135 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:47:34.578664 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.578594 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mt7fq" Apr 23 08:47:34.585084 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.585059 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mt7fq" Apr 23 08:47:34.606895 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.606851 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:42:33 +0000 UTC" deadline="2027-09-25 19:46:33.830237416 +0000 UTC" Apr 23 08:47:34.606895 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.606887 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12490h58m59.22335375s" Apr 23 08:47:34.627724 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.627699 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:47:34.627881 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.627787 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 
08:47:34.728706 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:34.728667 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-232.ec2.internal\" not found" Apr 23 08:47:34.751770 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:34.751731 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4be9192100d7e0e31845a4171d4170.slice/crio-79a9122bca7da3189130a7f236c1b2c66d69c666fdd078e3d23badffe6b11279 WatchSource:0}: Error finding container 79a9122bca7da3189130a7f236c1b2c66d69c666fdd078e3d23badffe6b11279: Status 404 returned error can't find the container with id 79a9122bca7da3189130a7f236c1b2c66d69c666fdd078e3d23badffe6b11279 Apr 23 08:47:34.752154 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:34.752125 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99512a33fd32ce0e80258f2ae365c428.slice/crio-f739cb49c2706ce677e7dcd5ca18ccc57660c7bcaf6a35ec5e7e957ba4b82702 WatchSource:0}: Error finding container f739cb49c2706ce677e7dcd5ca18ccc57660c7bcaf6a35ec5e7e957ba4b82702: Status 404 returned error can't find the container with id f739cb49c2706ce677e7dcd5ca18ccc57660c7bcaf6a35ec5e7e957ba4b82702 Apr 23 08:47:34.755647 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.755633 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:47:34.824775 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.824747 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:47:34.831062 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.830996 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" Apr 23 08:47:34.846742 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.846719 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:47:34.847833 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.847817 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" Apr 23 08:47:34.860396 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.860372 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:47:34.995921 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:34.995889 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:47:35.505927 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.505891 2579 apiserver.go:52] "Watching apiserver" Apr 23 08:47:35.517643 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.517611 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:47:35.518102 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.518069 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal","openshift-multus/multus-lh64p","openshift-network-diagnostics/network-check-target-7lkbc","kube-system/global-pull-secret-syncer-dcmzb","openshift-dns/node-resolver-8jczv","openshift-multus/multus-additional-cni-plugins-lr5f7","openshift-multus/network-metrics-daemon-msghc","openshift-network-operator/iptables-alerter-z6vqc","openshift-ovn-kubernetes/ovnkube-node-9ltg5","kube-system/konnectivity-agent-xlcxj","kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv","openshift-cluster-node-tuning-operator/tuned-8x5bw","openshift-image-registry/node-ca-kqtvw"] Apr 23 08:47:35.521647 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.521622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.523920 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.523894 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.525156 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.524999 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:47:35.525156 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.525096 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:47:35.525156 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.525036 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.525545 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.525528 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:47:35.525955 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.525889 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:47:35.526501 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.526485 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.526601 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.526541 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:35.526665 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.526630 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:35.527523 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.527500 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b8zhm\"" Apr 23 08:47:35.527622 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.527538 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:47:35.527622 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.527571 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j9zjj\"" Apr 23 08:47:35.527894 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.527825 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:47:35.528333 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.528310 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.528410 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.528331 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:47:35.528410 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.528331 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.528904 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.528886 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.533372 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.533348 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.533658 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.533636 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 08:47:35.533714 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.533698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.534073 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.534029 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5lxjk\"" Apr 23 08:47:35.534374 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.534354 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.537344 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.537307 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.537470 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.537412 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.538370 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.538215 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.538370 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.538259 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tnkm9\"" Apr 23 08:47:35.538370 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.538317 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.539914 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.539894 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.540989 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.540881 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:47:35.540989 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.540894 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fqh4q\"" Apr 23 08:47:35.541537 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541305 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:47:35.541537 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-kubelet\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-cnibin\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-cni-multus\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-hostroot\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541960 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-conf-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.541987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-multus-certs\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542012 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53bacb41-fc36-4333-8726-ced662fc325c-cni-binary-copy\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542086 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-kubelet\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8c9848f-46ab-4976-a1d7-0f02cd542da7-host\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-system-cni-dir\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-cnibin\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-run-netns\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-systemd\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-var-lib-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.542611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-etc-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542338 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-77bxt\"" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovnkube-script-lib\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-os-release\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542414 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-socket-dir-parent\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542439 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-systemd-units\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-slash\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-node-log\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-system-cni-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-cni-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-cni-bin\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542587 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgd8t\" (UniqueName: \"kubernetes.io/projected/b8c9848f-46ab-4976-a1d7-0f02cd542da7-kube-api-access-tgd8t\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d506f42-bf26-438f-96ee-d33036fd434d-tmp-dir\") pod \"node-resolver-8jczv\" (UID: 
\"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnv6v\" (UniqueName: \"kubernetes.io/projected/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-kube-api-access-pnv6v\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-k8s-cni-cncf-io\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542732 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d506f42-bf26-438f-96ee-d33036fd434d-hosts-file\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87lcf\" (UniqueName: \"kubernetes.io/projected/4d506f42-bf26-438f-96ee-d33036fd434d-kube-api-access-87lcf\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.543465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542808 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542840 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-cni-bin\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542919 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-etc-kubernetes\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtq8b\" (UniqueName: \"kubernetes.io/projected/53bacb41-fc36-4333-8726-ced662fc325c-kube-api-access-mtq8b\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-cni-binary-copy\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543001 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543061 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5hdm4\"" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.542998 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ztd6\" (UniqueName: \"kubernetes.io/projected/0b4f3610-c641-4979-b0b6-27f27e89efe0-kube-api-access-8ztd6\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-ovn\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-log-socket\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543303 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-cni-netd\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-netns\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.544181 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-os-release\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.544942 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovnkube-config\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544942 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-env-overrides\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544942 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovn-node-metrics-cert\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.544942 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53bacb41-fc36-4333-8726-ced662fc325c-multus-daemon-config\") pod \"multus-lh64p\" (UID: 
\"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.544942 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543471 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b8c9848f-46ab-4976-a1d7-0f02cd542da7-serviceca\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.544942 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.543692 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:47:35.545529 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.545508 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.548457 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.548127 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:35.548457 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.548216 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:35.548733 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.548626 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:47:35.548804 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.548755 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n2m64\"" Apr 23 08:47:35.549479 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.549280 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:47:35.550543 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.550524 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.550626 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.550582 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:35.550626 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.550530 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.553090 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.553072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jc9lv\"" Apr 23 08:47:35.553285 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.553267 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 08:47:35.553285 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.553278 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:47:35.586819 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.586781 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:42:34 +0000 UTC" deadline="2027-09-29 01:26:38.840461956 +0000 UTC" Apr 23 08:47:35.586819 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.586816 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12568h39m3.253650184s" Apr 23 08:47:35.610934 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.610870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" event={"ID":"99512a33fd32ce0e80258f2ae365c428","Type":"ContainerStarted","Data":"f739cb49c2706ce677e7dcd5ca18ccc57660c7bcaf6a35ec5e7e957ba4b82702"} Apr 23 08:47:35.612206 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.612171 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" event={"ID":"9c4be9192100d7e0e31845a4171d4170","Type":"ContainerStarted","Data":"79a9122bca7da3189130a7f236c1b2c66d69c666fdd078e3d23badffe6b11279"} Apr 23 08:47:35.633640 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.633606 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 08:47:35.644722 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-etc-kubernetes\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-cni-binary-copy\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ztd6\" (UniqueName: \"kubernetes.io/projected/0b4f3610-c641-4979-b0b6-27f27e89efe0-kube-api-access-8ztd6\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644780 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-etc-kubernetes\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce69dc2d-3bc0-419d-b6d3-c42290984071-agent-certs\") pod \"konnectivity-agent-xlcxj\" (UID: \"ce69dc2d-3bc0-419d-b6d3-c42290984071\") " pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysconfig\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-ovn\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.644891 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644886 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-cni-netd\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-netns\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-os-release\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.644984 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " 
pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-ovn\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zp7x\" (UniqueName: \"kubernetes.io/projected/4382bbb7-55ae-4eb9-a5be-4925147cf49d-kube-api-access-7zp7x\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53bacb41-fc36-4333-8726-ced662fc325c-multus-daemon-config\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645111 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18df44e5-6990-4d9e-89c1-25b5e449602b-host-slash\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645138 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-host\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645200 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-os-release\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg4m\" (UniqueName: \"kubernetes.io/projected/321aa24e-593f-4b13-bd8d-31c635ce80f5-kube-api-access-bfg4m\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-kubelet\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-cni-netd\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645141 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-netns\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-cni-multus\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-hostroot\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-kubelet\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8c9848f-46ab-4976-a1d7-0f02cd542da7-host\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645367 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce69dc2d-3bc0-419d-b6d3-c42290984071-konnectivity-ca\") pod \"konnectivity-agent-xlcxj\" (UID: \"ce69dc2d-3bc0-419d-b6d3-c42290984071\") " pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-hostroot\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-cni-multus\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53bacb41-fc36-4333-8726-ced662fc325c-cni-binary-copy\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-system-cni-dir\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645555 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-kubelet-config\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-cni-binary-copy\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645580 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-sys\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645669 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8c9848f-46ab-4976-a1d7-0f02cd542da7-host\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53bacb41-fc36-4333-8726-ced662fc325c-multus-daemon-config\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645703 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-tuned\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-system-cni-dir\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.645973 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-run-netns\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645772 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-run-netns\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-var-lib-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-var-lib-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovnkube-script-lib\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-os-release\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-cni-bin\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.646771 
ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.645987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-registration-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-run\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-cni-bin\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-systemd-units\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-slash\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-cni-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgd8t\" (UniqueName: \"kubernetes.io/projected/b8c9848f-46ab-4976-a1d7-0f02cd542da7-kube-api-access-tgd8t\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-cni-dir\") pod \"multus-lh64p\" (UID: 
\"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53bacb41-fc36-4333-8726-ced662fc325c-cni-binary-copy\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.646771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646349 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-systemd-units\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646351 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xv2\" (UniqueName: \"kubernetes.io/projected/d7eddd42-5408-4933-957d-9e3947f66e44-kube-api-access-42xv2\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646393 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-slash\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnv6v\" (UniqueName: \"kubernetes.io/projected/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-kube-api-access-pnv6v\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovnkube-script-lib\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-os-release\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d506f42-bf26-438f-96ee-d33036fd434d-hosts-file\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646779 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-sys-fs\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646804 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/18df44e5-6990-4d9e-89c1-25b5e449602b-iptables-alerter-script\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646821 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d506f42-bf26-438f-96ee-d33036fd434d-hosts-file\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-var-lib-kubelet\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/321aa24e-593f-4b13-bd8d-31c635ce80f5-tmp\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 
08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646895 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-cni-bin\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.647594 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtq8b\" (UniqueName: \"kubernetes.io/projected/53bacb41-fc36-4333-8726-ced662fc325c-kube-api-access-mtq8b\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-cni-bin\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.646993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-log-socket\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-socket-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovnkube-config\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-env-overrides\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-log-socket\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovn-node-metrics-cert\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647483 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-env-overrides\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0b4f3610-c641-4979-b0b6-27f27e89efe0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b8c9848f-46ab-4976-a1d7-0f02cd542da7-serviceca\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.647763 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfr47\" (UniqueName: \"kubernetes.io/projected/18df44e5-6990-4d9e-89c1-25b5e449602b-kube-api-access-dfr47\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovnkube-config\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-modprobe-d\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.648358 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-lib-modules\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-cnibin\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-conf-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-multus-certs\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-device-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648343 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-kubelet\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-cnibin\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.649127 ip-10-0-141-232 
kubenswrapper[2579]: I0423 08:47:35.648414 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-dbus\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648439 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-kubernetes\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysctl-d\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-systemd\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-etc-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-socket-dir-parent\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-systemd\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-node-log\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649127 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-system-cni-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-multus-certs\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d506f42-bf26-438f-96ee-d33036fd434d-tmp-dir\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-socket-dir-parent\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-multus-conf-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648775 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-etc-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-cnibin\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0b4f3610-c641-4979-b0b6-27f27e89efe0-cnibin\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648830 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-system-cni-dir\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648875 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-systemd\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-node-log\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.648949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-var-lib-kubelet\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87lcf\" (UniqueName: \"kubernetes.io/projected/4d506f42-bf26-438f-96ee-d33036fd434d-kube-api-access-87lcf\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-k8s-cni-cncf-io\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b8c9848f-46ab-4976-a1d7-0f02cd542da7-serviceca\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysctl-conf\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/4d506f42-bf26-438f-96ee-d33036fd434d-tmp-dir\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.649842 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.650638 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649143 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-run-openvswitch\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.650638 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.649186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53bacb41-fc36-4333-8726-ced662fc325c-host-run-k8s-cni-cncf-io\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.651129 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.651097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-ovn-node-metrics-cert\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.652086 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.652029 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:47:35.652086 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.652076 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:47:35.652086 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.652092 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:35.652297 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.652171 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:36.152150097 +0000 UTC m=+3.131686897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:35.657245 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.657147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtq8b\" (UniqueName: \"kubernetes.io/projected/53bacb41-fc36-4333-8726-ced662fc325c-kube-api-access-mtq8b\") pod \"multus-lh64p\" (UID: \"53bacb41-fc36-4333-8726-ced662fc325c\") " pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.657245 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.657147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnv6v\" (UniqueName: \"kubernetes.io/projected/ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a-kube-api-access-pnv6v\") pod \"ovnkube-node-9ltg5\" (UID: \"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.657552 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.657532 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgd8t\" (UniqueName: \"kubernetes.io/projected/b8c9848f-46ab-4976-a1d7-0f02cd542da7-kube-api-access-tgd8t\") pod \"node-ca-kqtvw\" (UID: \"b8c9848f-46ab-4976-a1d7-0f02cd542da7\") " pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.657800 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.657779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ztd6\" (UniqueName: \"kubernetes.io/projected/0b4f3610-c641-4979-b0b6-27f27e89efe0-kube-api-access-8ztd6\") pod \"multus-additional-cni-plugins-lr5f7\" (UID: \"0b4f3610-c641-4979-b0b6-27f27e89efe0\") " pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.658937 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.658898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87lcf\" (UniqueName: \"kubernetes.io/projected/4d506f42-bf26-438f-96ee-d33036fd434d-kube-api-access-87lcf\") pod \"node-resolver-8jczv\" (UID: \"4d506f42-bf26-438f-96ee-d33036fd434d\") " pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.749817 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfr47\" (UniqueName: \"kubernetes.io/projected/18df44e5-6990-4d9e-89c1-25b5e449602b-kube-api-access-dfr47\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-modprobe-d\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-lib-modules\") pod 
\"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-device-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-dbus\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-kubernetes\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.749986 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysctl-d\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749981 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-modprobe-d\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.749985 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-lib-modules\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-systemd\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-device-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-systemd\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-kubernetes\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysctl-d\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-dbus\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysctl-conf\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce69dc2d-3bc0-419d-b6d3-c42290984071-agent-certs\") pod \"konnectivity-agent-xlcxj\" (UID: \"ce69dc2d-3bc0-419d-b6d3-c42290984071\") " pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysconfig\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750320 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zp7x\" (UniqueName: \"kubernetes.io/projected/4382bbb7-55ae-4eb9-a5be-4925147cf49d-kube-api-access-7zp7x\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysconfig\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18df44e5-6990-4d9e-89c1-25b5e449602b-host-slash\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.750378 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750319 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-sysctl-conf\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-host\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfg4m\" (UniqueName: \"kubernetes.io/projected/321aa24e-593f-4b13-bd8d-31c635ce80f5-kube-api-access-bfg4m\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.750438 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce69dc2d-3bc0-419d-b6d3-c42290984071-konnectivity-ca\") pod \"konnectivity-agent-xlcxj\" (UID: \"ce69dc2d-3bc0-419d-b6d3-c42290984071\") " pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750550 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-kubelet-config\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-sys\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-tuned\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-registration-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-run\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42xv2\" (UniqueName: \"kubernetes.io/projected/d7eddd42-5408-4933-957d-9e3947f66e44-kube-api-access-42xv2\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-sys\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750707 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-sys-fs\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-host\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.750731 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:36.250714148 +0000 UTC m=+3.230250945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:35.751207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750770 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18df44e5-6990-4d9e-89c1-25b5e449602b-host-slash\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-kubelet-config\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-run\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/18df44e5-6990-4d9e-89c1-25b5e449602b-iptables-alerter-script\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.750867 2579 secret.go:189] Couldn't get 
secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-var-lib-kubelet\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750900 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/321aa24e-593f-4b13-bd8d-31c635ce80f5-tmp\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:35.750914 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:47:36.250897986 +0000 UTC m=+3.230434781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-socket-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.751111 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/321aa24e-593f-4b13-bd8d-31c635ce80f5-var-lib-kubelet\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.751117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce69dc2d-3bc0-419d-b6d3-c42290984071-konnectivity-ca\") pod \"konnectivity-agent-xlcxj\" (UID: \"ce69dc2d-3bc0-419d-b6d3-c42290984071\") " pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-registration-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.750772 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-sys-fs\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: 
\"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.751156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7eddd42-5408-4933-957d-9e3947f66e44-socket-dir\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.752012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.751358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/18df44e5-6990-4d9e-89c1-25b5e449602b-iptables-alerter-script\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.753412 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.753366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/321aa24e-593f-4b13-bd8d-31c635ce80f5-etc-tuned\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.753412 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.753394 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/321aa24e-593f-4b13-bd8d-31c635ce80f5-tmp\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.753537 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.753469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce69dc2d-3bc0-419d-b6d3-c42290984071-agent-certs\") pod \"konnectivity-agent-xlcxj\" (UID: \"ce69dc2d-3bc0-419d-b6d3-c42290984071\") " pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.762772 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.762607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfr47\" (UniqueName: \"kubernetes.io/projected/18df44e5-6990-4d9e-89c1-25b5e449602b-kube-api-access-dfr47\") pod \"iptables-alerter-z6vqc\" (UID: \"18df44e5-6990-4d9e-89c1-25b5e449602b\") " pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.765918 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.765887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zp7x\" (UniqueName: \"kubernetes.io/projected/4382bbb7-55ae-4eb9-a5be-4925147cf49d-kube-api-access-7zp7x\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:35.768391 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.768361 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg4m\" (UniqueName: \"kubernetes.io/projected/321aa24e-593f-4b13-bd8d-31c635ce80f5-kube-api-access-bfg4m\") pod \"tuned-8x5bw\" (UID: \"321aa24e-593f-4b13-bd8d-31c635ce80f5\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:35.768391 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.768380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42xv2\" (UniqueName: 
\"kubernetes.io/projected/d7eddd42-5408-4933-957d-9e3947f66e44-kube-api-access-42xv2\") pod \"aws-ebs-csi-driver-node-dtfbv\" (UID: \"d7eddd42-5408-4933-957d-9e3947f66e44\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.834677 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.834607 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:35.844778 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.844739 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lh64p" Apr 23 08:47:35.854864 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.854835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kqtvw" Apr 23 08:47:35.861542 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.861515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8jczv" Apr 23 08:47:35.869237 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.869203 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" Apr 23 08:47:35.876961 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.876926 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" Apr 23 08:47:35.884665 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.884631 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z6vqc" Apr 23 08:47:35.899473 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.899442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:35.907307 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:35.907281 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" Apr 23 08:47:36.153720 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.153645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:36.153862 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.153792 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:47:36.153862 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.153811 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:47:36.153862 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.153822 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:36.154006 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.153877 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:37.153858455 +0000 UTC m=+4.133395256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:36.254164 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.254124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:36.254328 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.254187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:36.254328 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.254287 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:36.254328 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.254291 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:36.254496 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.254348 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:37.254330422 +0000 UTC m=+4.233867216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:36.254496 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:36.254365 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:47:37.254356893 +0000 UTC m=+4.233893688 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:36.424054 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.423786 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee4b8a45_674b_4a0c_ae63_3fe36bbbf85a.slice/crio-f64579ec86bb3f1e466e48186d206e053c4d22d551cca3e15361f660b772d4ea WatchSource:0}: Error finding container f64579ec86bb3f1e466e48186d206e053c4d22d551cca3e15361f660b772d4ea: Status 404 returned error can't find the container with id f64579ec86bb3f1e466e48186d206e053c4d22d551cca3e15361f660b772d4ea Apr 23 08:47:36.427311 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.427284 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53bacb41_fc36_4333_8726_ced662fc325c.slice/crio-6b9ee418b732d8194c521096011e8c1376f533f6c4bd2e01621c1fcb6abf997b WatchSource:0}: Error finding container 6b9ee418b732d8194c521096011e8c1376f533f6c4bd2e01621c1fcb6abf997b: Status 404 returned error can't find the container with id 6b9ee418b732d8194c521096011e8c1376f533f6c4bd2e01621c1fcb6abf997b Apr 23 08:47:36.429028 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.429004 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7eddd42_5408_4933_957d_9e3947f66e44.slice/crio-c47889eb253fec7dc04cedfd547dd116ca2fa359462254391bcac6f7e55ff582 WatchSource:0}: Error finding container c47889eb253fec7dc04cedfd547dd116ca2fa359462254391bcac6f7e55ff582: Status 404 returned error can't find the container with id c47889eb253fec7dc04cedfd547dd116ca2fa359462254391bcac6f7e55ff582 Apr 23 08:47:36.429688 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.429668 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod321aa24e_593f_4b13_bd8d_31c635ce80f5.slice/crio-349dbf7e13351a4f6845cd992370e334cf725290f9843ba1bc396cd41b94239b WatchSource:0}: Error finding container 349dbf7e13351a4f6845cd992370e334cf725290f9843ba1bc396cd41b94239b: Status 404 returned error can't find the container with id 349dbf7e13351a4f6845cd992370e334cf725290f9843ba1bc396cd41b94239b Apr 23 08:47:36.430713 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.430688 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18df44e5_6990_4d9e_89c1_25b5e449602b.slice/crio-01895fe5f056eee04ffe240fc01124adffd0e3fe9e55e5389a64f2175ebd192e WatchSource:0}: Error finding container 01895fe5f056eee04ffe240fc01124adffd0e3fe9e55e5389a64f2175ebd192e: Status 404 returned error can't find the container with id 01895fe5f056eee04ffe240fc01124adffd0e3fe9e55e5389a64f2175ebd192e Apr 23 08:47:36.432381 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.432357 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d506f42_bf26_438f_96ee_d33036fd434d.slice/crio-18e34dc3deaf85d1673a4387c3ce6eaca2112e4b14dbf7d7b8c7023882bc6c17 WatchSource:0}: Error finding container 18e34dc3deaf85d1673a4387c3ce6eaca2112e4b14dbf7d7b8c7023882bc6c17: Status 404 returned error can't 
find the container with id 18e34dc3deaf85d1673a4387c3ce6eaca2112e4b14dbf7d7b8c7023882bc6c17 Apr 23 08:47:36.433850 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.432738 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce69dc2d_3bc0_419d_b6d3_c42290984071.slice/crio-b568101bfd211ff2feb9dc91fc322fdee050e8d00283ff15c3b5cc85be24d2c1 WatchSource:0}: Error finding container b568101bfd211ff2feb9dc91fc322fdee050e8d00283ff15c3b5cc85be24d2c1: Status 404 returned error can't find the container with id b568101bfd211ff2feb9dc91fc322fdee050e8d00283ff15c3b5cc85be24d2c1 Apr 23 08:47:36.435014 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.434527 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c9848f_46ab_4976_a1d7_0f02cd542da7.slice/crio-599358c2d015e7a511adb9669d197388fcf48c50ee06eff8fdffaccadf77c8d6 WatchSource:0}: Error finding container 599358c2d015e7a511adb9669d197388fcf48c50ee06eff8fdffaccadf77c8d6: Status 404 returned error can't find the container with id 599358c2d015e7a511adb9669d197388fcf48c50ee06eff8fdffaccadf77c8d6 Apr 23 08:47:36.435391 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:47:36.435366 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b4f3610_c641_4979_b0b6_27f27e89efe0.slice/crio-4fad4100b84df408cd420b715c836e68d2b7352808d7450e3d7be950de58a81d WatchSource:0}: Error finding container 4fad4100b84df408cd420b715c836e68d2b7352808d7450e3d7be950de58a81d: Status 404 returned error can't find the container with id 4fad4100b84df408cd420b715c836e68d2b7352808d7450e3d7be950de58a81d Apr 23 08:47:36.587723 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.587686 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:42:34 +0000 UTC" deadline="2027-09-17 06:39:38.101656775 +0000 UTC" Apr 23 08:47:36.587723 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.587717 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12285h52m1.513942731s" Apr 23 08:47:36.615193 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.615158 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" event={"ID":"321aa24e-593f-4b13-bd8d-31c635ce80f5","Type":"ContainerStarted","Data":"349dbf7e13351a4f6845cd992370e334cf725290f9843ba1bc396cd41b94239b"} Apr 23 08:47:36.616107 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.616076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" event={"ID":"d7eddd42-5408-4933-957d-9e3947f66e44","Type":"ContainerStarted","Data":"c47889eb253fec7dc04cedfd547dd116ca2fa359462254391bcac6f7e55ff582"} Apr 23 08:47:36.617010 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.616981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lh64p" event={"ID":"53bacb41-fc36-4333-8726-ced662fc325c","Type":"ContainerStarted","Data":"6b9ee418b732d8194c521096011e8c1376f533f6c4bd2e01621c1fcb6abf997b"} Apr 23 08:47:36.617905 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.617873 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" 
event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerStarted","Data":"4fad4100b84df408cd420b715c836e68d2b7352808d7450e3d7be950de58a81d"} Apr 23 08:47:36.618724 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.618703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kqtvw" event={"ID":"b8c9848f-46ab-4976-a1d7-0f02cd542da7","Type":"ContainerStarted","Data":"599358c2d015e7a511adb9669d197388fcf48c50ee06eff8fdffaccadf77c8d6"} Apr 23 08:47:36.619663 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.619635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xlcxj" event={"ID":"ce69dc2d-3bc0-419d-b6d3-c42290984071","Type":"ContainerStarted","Data":"b568101bfd211ff2feb9dc91fc322fdee050e8d00283ff15c3b5cc85be24d2c1"} Apr 23 08:47:36.620657 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.620636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8jczv" event={"ID":"4d506f42-bf26-438f-96ee-d33036fd434d","Type":"ContainerStarted","Data":"18e34dc3deaf85d1673a4387c3ce6eaca2112e4b14dbf7d7b8c7023882bc6c17"} Apr 23 08:47:36.621592 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.621573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z6vqc" event={"ID":"18df44e5-6990-4d9e-89c1-25b5e449602b","Type":"ContainerStarted","Data":"01895fe5f056eee04ffe240fc01124adffd0e3fe9e55e5389a64f2175ebd192e"} Apr 23 08:47:36.622538 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.622514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"f64579ec86bb3f1e466e48186d206e053c4d22d551cca3e15361f660b772d4ea"} Apr 23 08:47:36.624138 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:36.624116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" event={"ID":"99512a33fd32ce0e80258f2ae365c428","Type":"ContainerStarted","Data":"e496f96f1533688e567c65cb3560517806a839852c60a66f92701b9a9cfeec53"} Apr 23 08:47:37.163685 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:37.163646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:37.164396 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.163884 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:47:37.164396 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.163908 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:47:37.164396 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.163921 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:37.164396 
ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.163989 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:39.16396954 +0000 UTC m=+6.143506337 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:37.265325 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:37.264571 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:37.265325 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:37.264650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:37.265325 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.264778 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:37.265325 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.264841 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:47:39.264823162 +0000 UTC m=+6.244359971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:37.265325 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.265248 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:37.265325 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.265297 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:39.265280217 +0000 UTC m=+6.244817010 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:37.608742 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:37.608695 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:37.609224 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.608839 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:37.609400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:37.609379 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:37.609519 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.609498 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:37.609623 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:37.609586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:37.609773 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:37.609752 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:38.647094 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:38.646236 2579 generic.go:358] "Generic (PLEG): container finished" podID="9c4be9192100d7e0e31845a4171d4170" containerID="19f730bd0d10b96a9e0260b34c5048e602651f0a7dcd6475f8e8f3a97deacb06" exitCode=0 Apr 23 08:47:38.647094 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:38.646314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" event={"ID":"9c4be9192100d7e0e31845a4171d4170","Type":"ContainerDied","Data":"19f730bd0d10b96a9e0260b34c5048e602651f0a7dcd6475f8e8f3a97deacb06"} Apr 23 08:47:38.665121 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:38.663781 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-232.ec2.internal" podStartSLOduration=4.663757566 podStartE2EDuration="4.663757566s" podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:47:36.647190432 +0000 UTC m=+3.626727248" watchObservedRunningTime="2026-04-23 08:47:38.663757566 +0000 UTC m=+5.643294383" Apr 23 08:47:39.182576 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:39.182481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:39.182853 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.182691 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:47:39.182853 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.182712 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:47:39.182853 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.182731 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:39.182853 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.182789 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:43.182770833 +0000 UTC m=+10.162307641 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:39.283240 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:39.283199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:39.283437 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:39.283281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:39.283437 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.283433 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:39.283553 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.283493 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:47:43.283476014 +0000 UTC m=+10.263012810 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:39.283937 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.283908 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:39.284038 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.283976 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:43.283961099 +0000 UTC m=+10.263497896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:39.607375 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:39.606330 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:39.607375 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.606480 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:39.607375 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:39.607164 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:39.607375 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:39.607181 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:39.607375 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.607261 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:39.607375 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:39.607342 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:41.605506 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:41.605362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:41.605506 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:41.605383 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:41.605982 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:41.605509 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:41.605982 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:41.605577 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:41.605982 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:41.605671 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:41.605982 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:41.605803 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:43.217026 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.216380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:43.217026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.216590 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:47:43.217026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.216612 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:47:43.217026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.216625 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:43.217026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.216684 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:51.216664172 +0000 UTC m=+18.196200978 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:47:43.318010 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.317234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:43.318010 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.317318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:43.318010 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.317460 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:43.318010 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.317523 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:47:51.317505267 +0000 UTC m=+18.297042071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:47:43.318010 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.317920 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:43.318010 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.317974 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:47:51.31795805 +0000 UTC m=+18.297494846 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:47:43.607372 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.607340 2579 util.go:30] "No sandbox for pod can be found. 
Apr 23 08:47:43.607372 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.607340 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:43.607538 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.607465 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2"
Apr 23 08:47:43.607810 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.607790 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:43.607922 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.607901 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60"
Apr 23 08:47:43.608998 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:43.608977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:43.609123 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:43.609089 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d"
Apr 23 08:47:45.606182 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:45.606142 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:45.606182 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:45.606170 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:45.606685 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:45.606142 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:45.606685 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:45.606450 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60"
Apr 23 08:47:45.606685 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:45.606560 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2"
Apr 23 08:47:45.606685 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:45.606670 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d"
Apr 23 08:47:47.605625 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:47.605591 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:47.605625 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:47.605619 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:47.606209 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:47.605590 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:47.606209 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:47.605727 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60"
Apr 23 08:47:47.606209 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:47.605809 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2"
Apr 23 08:47:47.606209 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:47.605900 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d"
Apr 23 08:47:49.606384 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:49.606348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:49.606862 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:49.606348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:49.606862 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:49.606473 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d"
Apr 23 08:47:49.606862 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:49.606348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:49.606862 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:49.606570 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2"
Apr 23 08:47:49.606862 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:49.606619 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60"
Apr 23 08:47:51.276824 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:51.276779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:51.277251 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.276966 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:47:51.277251 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.276993 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:47:51.277251 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.277008 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:47:51.277251 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.277082 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:07.277061511 +0000 UTC m=+34.256598308 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:47:51.378168 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:51.378127 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:51.378346 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:51.378208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:51.378346 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.378263 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:47:51.378346 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.378319 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:47:51.378346 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.378346 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:07.378322256 +0000 UTC m=+34.357859056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:47:51.378565 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.378372 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:48:07.37835458 +0000 UTC m=+34.357891375 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:47:51.606342 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:51.606205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:51.606342 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:51.606269 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:51.606551 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.606392 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60"
Apr 23 08:47:51.606551 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.606474 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d"
Apr 23 08:47:51.606551 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:51.606534 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:51.606686 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:51.606610 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2"
Apr 23 08:47:53.609344 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.609078 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc"
Apr 23 08:47:53.609344 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.609309 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc"
Apr 23 08:47:53.609997 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.609304 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb"
Apr 23 08:47:53.609997 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:53.609410 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2"
Apr 23 08:47:53.609997 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:53.609493 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60"
Apr 23 08:47:53.609997 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:53.609570 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d"
Apr 23 08:47:53.674889 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.674855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8jczv" event={"ID":"4d506f42-bf26-438f-96ee-d33036fd434d","Type":"ContainerStarted","Data":"bfb7530e4ba3225a7c2907c94fbe869ea61e1356a1625661f083960e96360d85"}
Apr 23 08:47:53.676622 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.676591 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" event={"ID":"9c4be9192100d7e0e31845a4171d4170","Type":"ContainerStarted","Data":"5f41684d04b49c3a02eb36d42ec5a293d44a79ab201b3c05068dc4bdcc67eea8"}
Apr 23 08:47:53.677875 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.677839 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" event={"ID":"321aa24e-593f-4b13-bd8d-31c635ce80f5","Type":"ContainerStarted","Data":"2860d492bc0e898780b7d3de76a49e7298d9f35537fe07d8d52afc3604026cdc"}
Apr 23 08:47:53.679207 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.679186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" event={"ID":"d7eddd42-5408-4933-957d-9e3947f66e44","Type":"ContainerStarted","Data":"ae6692035a54204b36b05fc12a5ce572d743eaa333221f59b885a54312cf6dde"}
Apr 23 08:47:53.680702 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.680677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lh64p" event={"ID":"53bacb41-fc36-4333-8726-ced662fc325c","Type":"ContainerStarted","Data":"c3c08e40142d5357c9f15a344ac4b5a71c203595f2bdfaea59af56eb0095af32"}
Apr 23 08:47:53.682386 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.682366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kqtvw" event={"ID":"b8c9848f-46ab-4976-a1d7-0f02cd542da7","Type":"ContainerStarted","Data":"c02641946283d341b3513817670f8d7dfc00f63900bd6123df117cb0fae83031"}
Apr 23 08:47:53.683988 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.683969 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xlcxj" event={"ID":"ce69dc2d-3bc0-419d-b6d3-c42290984071","Type":"ContainerStarted","Data":"411d186002e55fcf888bc22786b51d0cc23e03714cdd904d7883abbe694871ad"}
Apr 23 08:47:53.689608 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.689573 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8jczv" podStartSLOduration=3.833268717 podStartE2EDuration="20.689563069s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.434980995 +0000 UTC m=+3.414517789" lastFinishedPulling="2026-04-23 08:47:53.291275344 +0000 UTC m=+20.270812141" observedRunningTime="2026-04-23 08:47:53.689447592 +0000 UTC m=+20.668984407" watchObservedRunningTime="2026-04-23 08:47:53.689563069 +0000 UTC m=+20.669099885"
Apr 23 08:47:53.707739 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.707572 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8x5bw" podStartSLOduration=2.846914728 podStartE2EDuration="19.707558656s" podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.432581796 +0000 UTC m=+3.412118592" lastFinishedPulling="2026-04-23 08:47:53.293225713 +0000 UTC m=+20.272762520" observedRunningTime="2026-04-23 08:47:53.707164049 +0000 UTC m=+20.686700867" watchObservedRunningTime="2026-04-23 08:47:53.707558656 +0000 UTC m=+20.687095483"
Apr 23 08:47:53.719820 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.719759 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kqtvw" podStartSLOduration=3.865228959 podStartE2EDuration="20.719737731s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.43711464 +0000 UTC m=+3.416651438" lastFinishedPulling="2026-04-23 08:47:53.291623409 +0000 UTC m=+20.271160210" observedRunningTime="2026-04-23 08:47:53.719439958 +0000 UTC m=+20.698976773" watchObservedRunningTime="2026-04-23 08:47:53.719737731 +0000 UTC m=+20.699274549"
Apr 23 08:47:53.736101 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.736023 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xlcxj" podStartSLOduration=7.381391117 podStartE2EDuration="19.736001267s" podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.435324675 +0000 UTC m=+3.414861471" lastFinishedPulling="2026-04-23 08:47:48.789934828 +0000 UTC m=+15.769471621" observedRunningTime="2026-04-23 08:47:53.735543181 +0000 UTC m=+20.715079996" watchObservedRunningTime="2026-04-23 08:47:53.736001267 +0000 UTC m=+20.715538087"
Apr 23 08:47:53.761867 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:53.761820 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-232.ec2.internal" podStartSLOduration=19.76180424 podStartE2EDuration="19.76180424s" podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:47:53.761361002 +0000 UTC m=+20.740897819" watchObservedRunningTime="2026-04-23 08:47:53.76180424 +0000 UTC m=+20.741341058"
Apr 23 08:47:54.687650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.687463 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b4f3610-c641-4979-b0b6-27f27e89efe0" containerID="1be8bfbf65790d6c2a6963a67e044f543c429af94fba9093e8d445c6e1b2f558" exitCode=0
Apr 23 08:47:54.688267 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.687547 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerDied","Data":"1be8bfbf65790d6c2a6963a67e044f543c429af94fba9093e8d445c6e1b2f558"}
event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"e7c634ed730ab4f67927197d93a54d43ebca93332300841d88aa51f5ce40641c"} Apr 23 08:47:54.690510 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.690374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"e37090dcfa4806090cc0b0103fb50a4fd181512b995b38e56ac42cd5b8b156b0"} Apr 23 08:47:54.690510 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.690385 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"0af57a909063941a594315ebc4014d0319c10e7300e7a8cf604680162636f079"} Apr 23 08:47:54.690510 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.690394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"8a274938c12670c19bdbccff9096c18ceaee2f908ae74b0059907a17690b9ceb"} Apr 23 08:47:54.690510 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.690401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"a657568635ccb1e5f60d9446fed63465c909c537e2321abe5d2fef6a5c6f8e0b"} Apr 23 08:47:54.690510 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.690410 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"86fa1ebd9f3b4475a63b6e2046eb5e1686b2a30df02c27776f84fd8649459c0f"} Apr 23 08:47:54.707836 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.707786 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lh64p" podStartSLOduration=4.797302244 podStartE2EDuration="21.707769755s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.429670516 +0000 UTC m=+3.409207311" lastFinishedPulling="2026-04-23 08:47:53.340138021 +0000 UTC m=+20.319674822" observedRunningTime="2026-04-23 08:47:53.789884115 +0000 UTC m=+20.769420931" watchObservedRunningTime="2026-04-23 08:47:54.707769755 +0000 UTC m=+21.687306579" Apr 23 08:47:54.925245 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:54.925218 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:47:55.606356 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.606144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:55.606623 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.606194 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:55.606623 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:55.606442 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:55.606623 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:55.606506 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:55.606623 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.606219 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:55.606623 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:55.606594 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:55.614217 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.614080 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:47:54.925237728Z","UUID":"6bd5de22-f998-4ddf-83ce-c818d310382f","Handler":null,"Name":"","Endpoint":""} Apr 23 08:47:55.616026 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.616005 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:47:55.616026 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.616032 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:47:55.694793 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.694753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" event={"ID":"d7eddd42-5408-4933-957d-9e3947f66e44","Type":"ContainerStarted","Data":"0f3efe83f9162114ba32e379a59d1439665037999540ee9b5a3dde066584f22b"} Apr 23 08:47:55.696647 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.696346 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z6vqc" event={"ID":"18df44e5-6990-4d9e-89c1-25b5e449602b","Type":"ContainerStarted","Data":"f7768d37361c56702d9611e60d11d61bb19cdc117e2e714bdee46a910aaaea2d"} Apr 23 08:47:55.722644 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:55.722583 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z6vqc" podStartSLOduration=5.865380334 podStartE2EDuration="22.722562989s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.43408297 +0000 UTC m=+3.413619764" lastFinishedPulling="2026-04-23 08:47:53.291265622 +0000 UTC m=+20.270802419" observedRunningTime="2026-04-23 08:47:55.721813892 +0000 UTC m=+22.701350707" watchObservedRunningTime="2026-04-23 08:47:55.722562989 +0000 UTC m=+22.702099806" Apr 23 08:47:56.700777 ip-10-0-141-232 kubenswrapper[2579]: I0423 
08:47:56.700398 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" event={"ID":"d7eddd42-5408-4933-957d-9e3947f66e44","Type":"ContainerStarted","Data":"c549abf7483a508bf7e7616d4ca9f468fead412cb6d2620f7a2ced472f3d24dc"} Apr 23 08:47:56.703975 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:56.703929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"f54de44b1e6fb4c0deb34ba33c18f25edf8c35fc31164804d9a9e2a5363e2385"} Apr 23 08:47:56.720596 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:56.720543 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtfbv" podStartSLOduration=4.213461691 podStartE2EDuration="23.720527266s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.431693455 +0000 UTC m=+3.411230250" lastFinishedPulling="2026-04-23 08:47:55.93875902 +0000 UTC m=+22.918295825" observedRunningTime="2026-04-23 08:47:56.720085705 +0000 UTC m=+23.699622521" watchObservedRunningTime="2026-04-23 08:47:56.720527266 +0000 UTC m=+23.700064082" Apr 23 08:47:57.605522 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:57.605478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:57.605703 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:57.605599 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:57.605703 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:57.605485 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:57.605703 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:57.605478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:57.605836 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:57.605704 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:57.605836 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:57.605815 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:58.067464 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:58.067425 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:58.068282 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:58.068193 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:58.707469 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:58.707429 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:58.708278 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:58.708253 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xlcxj" Apr 23 08:47:59.606296 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.606103 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:47:59.606837 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.606103 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:47:59.606837 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:59.606387 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:47:59.606837 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:59.606477 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:47:59.606837 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.606103 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:47:59.606837 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:47:59.606589 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:47:59.711223 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.711188 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b4f3610-c641-4979-b0b6-27f27e89efe0" containerID="8911721979b00f984a57e1cd2f974a670c119f330e8b95335f969ef7f24c727e" exitCode=0 Apr 23 08:47:59.711372 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.711277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerDied","Data":"8911721979b00f984a57e1cd2f974a670c119f330e8b95335f969ef7f24c727e"} Apr 23 08:47:59.714741 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.714715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" event={"ID":"ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a","Type":"ContainerStarted","Data":"31600b895812f53973714afff01df6e0bfa027bfbe305805f3c3ab9ab25d72c2"} Apr 23 08:47:59.714962 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.714946 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:59.715032 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.714973 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:59.715032 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.714988 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:59.730144 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.730119 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:59.735462 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.735444 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:47:59.766111 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:47:59.766038 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" podStartSLOduration=9.583113127 podStartE2EDuration="26.766023918s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.425800563 +0000 UTC m=+3.405337357" lastFinishedPulling="2026-04-23 08:47:53.608711334 +0000 UTC m=+20.588248148" observedRunningTime="2026-04-23 08:47:59.764264504 +0000 UTC m=+26.743801321" watchObservedRunningTime="2026-04-23 08:47:59.766023918 +0000 UTC m=+26.745560733" Apr 23 08:48:01.204819 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.204738 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7lkbc"] Apr 23 08:48:01.205186 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.204858 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:01.205186 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:01.204937 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:48:01.207703 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.207677 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dcmzb"] Apr 23 08:48:01.207828 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.207797 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:01.207955 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:01.207924 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:48:01.208503 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.208457 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-msghc"] Apr 23 08:48:01.208600 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.208579 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:01.208708 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:01.208687 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:48:01.719756 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.719719 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b4f3610-c641-4979-b0b6-27f27e89efe0" containerID="ed1f3f12a356d7657620bbf224faa18b42012027b41e6326b3b13d3ce3a6ced1" exitCode=0 Apr 23 08:48:01.719922 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:01.719809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerDied","Data":"ed1f3f12a356d7657620bbf224faa18b42012027b41e6326b3b13d3ce3a6ced1"} Apr 23 08:48:02.606292 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:02.606260 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:02.606658 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:02.606260 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:02.606658 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:02.606378 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:48:02.606658 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:02.606428 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:48:02.606658 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:02.606259 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:02.606658 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:02.606538 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:48:03.725250 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:03.725029 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b4f3610-c641-4979-b0b6-27f27e89efe0" containerID="af7aefffec8713a77d26e7b8689ac352f2ffc25bd18cdebf9b1ad9660f52c233" exitCode=0 Apr 23 08:48:03.725250 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:03.725112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerDied","Data":"af7aefffec8713a77d26e7b8689ac352f2ffc25bd18cdebf9b1ad9660f52c233"} Apr 23 08:48:04.606036 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:04.606000 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:04.606228 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:04.606000 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:04.606228 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:04.606136 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:48:04.606228 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:04.606000 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:04.606228 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:04.606220 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:48:04.606426 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:04.606339 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:48:06.606243 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:06.606208 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:06.606243 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:06.606237 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:06.606724 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:06.606328 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dcmzb" podUID="7c7ca0ff-2599-42db-a76b-05f1bb8b8d60" Apr 23 08:48:06.606724 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:06.606410 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:06.606724 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:06.606411 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:48:06.606724 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:06.606498 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7lkbc" podUID="d41d976f-3b2d-457c-b9ac-f655f79bc0b2" Apr 23 08:48:07.271539 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.271512 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-232.ec2.internal" event="NodeReady" Apr 23 08:48:07.271766 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.271653 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 08:48:07.299012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.298978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:07.299196 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.299147 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:07.299196 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.299166 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:07.299196 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.299178 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cxjq5 for pod openshift-network-diagnostics/network-check-target-7lkbc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:07.299297 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.299228 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5 podName:d41d976f-3b2d-457c-b9ac-f655f79bc0b2 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.299214561 +0000 UTC m=+66.278751355 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxjq5" (UniqueName: "kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5") pod "network-check-target-7lkbc" (UID: "d41d976f-3b2d-457c-b9ac-f655f79bc0b2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:07.312709 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.312675 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f4458988b-jshml"] Apr 23 08:48:07.326436 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.326407 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9"] Apr 23 08:48:07.326640 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.326598 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.329554 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.329457 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 08:48:07.329554 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.329476 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 08:48:07.329894 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.329847 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 08:48:07.329894 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.329865 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpxkg\"" Apr 23 08:48:07.338697 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.337638 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 08:48:07.344918 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.344686 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45"] Apr 23 08:48:07.345121 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.344980 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.347703 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.347681 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 08:48:07.347992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.347974 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 08:48:07.348139 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.348083 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 08:48:07.348210 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.347984 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 08:48:07.362543 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.362514 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll"] Apr 23 08:48:07.362721 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.362684 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.365424 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.365403 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 08:48:07.365557 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.365466 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 08:48:07.365809 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.365791 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 08:48:07.365899 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.365860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 08:48:07.381482 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.381449 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2vhbm"] Apr 23 08:48:07.381645 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.381528 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.384241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.384218 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 08:48:07.384356 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.384284 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tn2xp\"" Apr 23 08:48:07.399446 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.399417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:07.399612 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.399478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:07.399612 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.399602 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:07.399729 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.399626 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9"] Apr 23 08:48:07.399729 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.399632 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:07.399729 ip-10-0-141-232 
kubenswrapper[2579]: I0423 08:48:07.399651 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f4458988b-jshml"] Apr 23 08:48:07.399729 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.399662 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.399644391 +0000 UTC m=+66.379181200 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:07.399729 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.399666 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45"] Apr 23 08:48:07.399729 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.399681 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8x8sn"] Apr 23 08:48:07.399729 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.399728 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret podName:7c7ca0ff-2599-42db-a76b-05f1bb8b8d60 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.399709139 +0000 UTC m=+66.379245948 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret") pod "global-pull-secret-syncer-dcmzb" (UID: "7c7ca0ff-2599-42db-a76b-05f1bb8b8d60") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:07.400056 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.399767 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.403480 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.403456 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 08:48:07.403617 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.403597 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 08:48:07.403715 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.403699 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hskmq\"" Apr 23 08:48:07.414921 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.414892 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2vhbm"] Apr 23 08:48:07.414921 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.414927 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll"] Apr 23 08:48:07.415146 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.414944 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8x8sn"] Apr 23 08:48:07.415146 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.415085 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:07.418266 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.418228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 08:48:07.418392 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.418326 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 08:48:07.418475 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.418454 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 08:48:07.421687 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.421664 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-55hsm\"" Apr 23 08:48:07.500157 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-ca-trust-extracted\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-trusted-ca\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-installation-pull-secrets\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-bound-sa-token\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-ca\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx548\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-kube-api-access-gx548\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500348 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-tmp\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-klusterlet-config\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgqw\" (UniqueName: \"kubernetes.io/projected/492bafe0-e29c-40bf-ae28-09c4e7b2e6c5-kube-api-access-hsgqw\") pod \"managed-serviceaccount-addon-agent-5dd44fcd45-t8bll\" (UID: \"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/87fd1d2e-4f49-403b-8134-81b4b406186c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13dbe402-f1f7-465b-bb2a-cea6503b73fb-config-volume\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hub\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-hub\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w47wl\" (UniqueName: \"kubernetes.io/projected/68bcbcf4-4715-4a78-9508-3415c39698dc-kube-api-access-w47wl\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500636 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-certificates\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500667 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500656 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13dbe402-f1f7-465b-bb2a-cea6503b73fb-tmp-dir\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnltd\" (UniqueName: \"kubernetes.io/projected/87fd1d2e-4f49-403b-8134-81b4b406186c-kube-api-access-pnltd\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500796 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/492bafe0-e29c-40bf-ae28-09c4e7b2e6c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5dd44fcd45-t8bll\" (UID: \"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-image-registry-private-configuration\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.500994 ip-10-0-141-232 
kubenswrapper[2579]: I0423 08:48:07.500903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2ll9\" (UniqueName: \"kubernetes.io/projected/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-kube-api-access-r2ll9\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:07.500994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.500958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpg7\" (UniqueName: \"kubernetes.io/projected/13dbe402-f1f7-465b-bb2a-cea6503b73fb-kube-api-access-kmpg7\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.601732 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.601646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-hub\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.601732 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.601684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w47wl\" (UniqueName: \"kubernetes.io/projected/68bcbcf4-4715-4a78-9508-3415c39698dc-kube-api-access-w47wl\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:07.601732 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.601732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-certificates\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.601992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.601759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13dbe402-f1f7-465b-bb2a-cea6503b73fb-tmp-dir\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.602067 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602001 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnltd\" (UniqueName: \"kubernetes.io/projected/87fd1d2e-4f49-403b-8134-81b4b406186c-kube-api-access-pnltd\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.602128 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/492bafe0-e29c-40bf-ae28-09c4e7b2e6c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5dd44fcd45-t8bll\" (UID: \"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.602195 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-image-registry-private-configuration\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602195 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602195 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13dbe402-f1f7-465b-bb2a-cea6503b73fb-tmp-dir\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.602337 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.602275 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:07.602337 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.602289 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:07.602337 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2ll9\" (UniqueName: \"kubernetes.io/projected/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-kube-api-access-r2ll9\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.602511 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.602340 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:08.102323333 +0000 UTC m=+35.081860143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:07.602511 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.602511 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:07.602511 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpg7\" (UniqueName: \"kubernetes.io/projected/13dbe402-f1f7-465b-bb2a-cea6503b73fb-kube-api-access-kmpg7\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.602511 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-ca-trust-extracted\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602511 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602473 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-certificates\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-trusted-ca\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.602664 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-installation-pull-secrets\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602719 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-bound-sa-token\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.602732 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:48:08.102716307 +0000 UTC m=+35.082253125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-ca-trust-extracted\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.602817 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-ca\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.603183 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.603183 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx548\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-kube-api-access-gx548\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.603183 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-tmp\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.603183 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.602926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-klusterlet-config\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: 
\"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.604090 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.603615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-trusted-ca\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607397 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-image-registry-private-configuration\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607438 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/492bafe0-e29c-40bf-ae28-09c4e7b2e6c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5dd44fcd45-t8bll\" (UID: \"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgqw\" (UniqueName: \"kubernetes.io/projected/492bafe0-e29c-40bf-ae28-09c4e7b2e6c5-kube-api-access-hsgqw\") pod \"managed-serviceaccount-addon-agent-5dd44fcd45-t8bll\" (UID: \"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/87fd1d2e-4f49-403b-8134-81b4b406186c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13dbe402-f1f7-465b-bb2a-cea6503b73fb-config-volume\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod 
\"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.607643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.607816 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:07.608074 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:07.607869 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:48:08.107851465 +0000 UTC m=+35.087388262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:07.609117 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.608173 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-hub\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.609117 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.608297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-tmp\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.609117 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.608577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/87fd1d2e-4f49-403b-8134-81b4b406186c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.609117 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.608659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-installation-pull-secrets\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.609117 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.608834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-klusterlet-config\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: 
\"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.610411 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.610384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/87fd1d2e-4f49-403b-8134-81b4b406186c-ca\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.611714 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.611534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w47wl\" (UniqueName: \"kubernetes.io/projected/68bcbcf4-4715-4a78-9508-3415c39698dc-kube-api-access-w47wl\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:07.612619 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.612591 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnltd\" (UniqueName: \"kubernetes.io/projected/87fd1d2e-4f49-403b-8134-81b4b406186c-kube-api-access-pnltd\") pod \"cluster-proxy-proxy-agent-7d74b4fcd6-djg45\" (UID: \"87fd1d2e-4f49-403b-8134-81b4b406186c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.612853 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.612822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-bound-sa-token\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.613349 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.613330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13dbe402-f1f7-465b-bb2a-cea6503b73fb-config-volume\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.613583 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.613555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpg7\" (UniqueName: \"kubernetes.io/projected/13dbe402-f1f7-465b-bb2a-cea6503b73fb-kube-api-access-kmpg7\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:07.613989 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.613965 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx548\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-kube-api-access-gx548\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:07.614301 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.614282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2ll9\" (UniqueName: \"kubernetes.io/projected/c761ccf7-a9a1-4de6-9eb7-f86af4f698ae-kube-api-access-r2ll9\") pod \"klusterlet-addon-workmgr-6db75f98cd-bcmc9\" (UID: \"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 
08:48:07.615850 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.615829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgqw\" (UniqueName: \"kubernetes.io/projected/492bafe0-e29c-40bf-ae28-09c4e7b2e6c5-kube-api-access-hsgqw\") pod \"managed-serviceaccount-addon-agent-5dd44fcd45-t8bll\" (UID: \"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:07.656240 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.656199 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:07.686177 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.686137 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:48:07.695975 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:07.695945 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" Apr 23 08:48:08.112524 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.112484 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.112590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.112618 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112661 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112731 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112740 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112759 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112782 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb 
nodeName:}" failed. No retries permitted until 2026-04-23 08:48:09.112761807 +0000 UTC m=+36.092298609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:08.112800 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112805 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:48:09.112794677 +0000 UTC m=+36.092331507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:08.113242 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:08.112822 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:09.112813158 +0000 UTC m=+36.092349953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:08.605758 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.605722 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:08.606058 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.605719 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:08.606058 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.605719 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:08.608640 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.608618 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:48:08.608994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.608721 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:48:08.609990 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.609968 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zffjs\"" Apr 23 08:48:08.609990 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.609982 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h75n2\"" Apr 23 08:48:08.610172 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.609994 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:48:08.610172 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:08.610097 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:48:09.122760 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.122723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:09.122760 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.122765 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.122801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122881 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122906 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122912 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122934 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122969 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:48:11.122951882 +0000 UTC m=+38.102488682 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122982 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:11.122976756 +0000 UTC m=+38.102513549 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:09.123130 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:09.122992 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:48:11.122986751 +0000 UTC m=+38.102523545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:09.598939 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.598650 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll"] Apr 23 08:48:09.599816 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.599744 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9"] Apr 23 08:48:09.601383 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.601359 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45"] Apr 23 08:48:09.620975 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:48:09.620922 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod492bafe0_e29c_40bf_ae28_09c4e7b2e6c5.slice/crio-8632613bc377a663fc262f452f5785e54811e20fe781b63a7ace50b1d161aaf0 WatchSource:0}: Error finding container 8632613bc377a663fc262f452f5785e54811e20fe781b63a7ace50b1d161aaf0: Status 404 returned error can't find the container with id 8632613bc377a663fc262f452f5785e54811e20fe781b63a7ace50b1d161aaf0 Apr 23 08:48:09.622197 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:48:09.622125 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc761ccf7_a9a1_4de6_9eb7_f86af4f698ae.slice/crio-2c768b3092f55f2d957005a02201c576d460efae8bc200a83b49e460ce0a2849 WatchSource:0}: Error finding container 2c768b3092f55f2d957005a02201c576d460efae8bc200a83b49e460ce0a2849: Status 404 returned error 
can't find the container with id 2c768b3092f55f2d957005a02201c576d460efae8bc200a83b49e460ce0a2849 Apr 23 08:48:09.622544 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:48:09.622517 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87fd1d2e_4f49_403b_8134_81b4b406186c.slice/crio-07c55770744841cff820d78bd9c352f5feafd45bffa50144a5667a392c05bc19 WatchSource:0}: Error finding container 07c55770744841cff820d78bd9c352f5feafd45bffa50144a5667a392c05bc19: Status 404 returned error can't find the container with id 07c55770744841cff820d78bd9c352f5feafd45bffa50144a5667a392c05bc19 Apr 23 08:48:09.742826 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.742733 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" event={"ID":"87fd1d2e-4f49-403b-8134-81b4b406186c","Type":"ContainerStarted","Data":"07c55770744841cff820d78bd9c352f5feafd45bffa50144a5667a392c05bc19"} Apr 23 08:48:09.744211 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.744174 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" event={"ID":"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5","Type":"ContainerStarted","Data":"8632613bc377a663fc262f452f5785e54811e20fe781b63a7ace50b1d161aaf0"} Apr 23 08:48:09.745588 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:09.745557 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" event={"ID":"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae","Type":"ContainerStarted","Data":"2c768b3092f55f2d957005a02201c576d460efae8bc200a83b49e460ce0a2849"} Apr 23 08:48:10.754988 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:10.754949 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b4f3610-c641-4979-b0b6-27f27e89efe0" containerID="ac37e2f59a8a666059651b464bc8fd2a5ae470a2aadcbe4ea2b8cb31aa04134b" exitCode=0 Apr 23 08:48:10.755519 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:10.755030 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerDied","Data":"ac37e2f59a8a666059651b464bc8fd2a5ae470a2aadcbe4ea2b8cb31aa04134b"} Apr 23 08:48:11.141198 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:11.141147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:11.141198 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:11.141201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:11.141413 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:11.141257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: 
\"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:11.141413 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.141405 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:11.141528 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.141466 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:48:15.141446931 +0000 UTC m=+42.120983728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:11.141906 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.141884 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:11.141984 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.141909 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:11.141984 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.141955 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:15.141938029 +0000 UTC m=+42.121474826 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:11.142120 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.142012 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:11.142120 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:11.142058 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:48:15.142031435 +0000 UTC m=+42.121568235 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:11.762920 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:11.761929 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b4f3610-c641-4979-b0b6-27f27e89efe0" containerID="acb88f32ecf34e160f26346a794566b1eae88f02f27f15078ff23efe63a12be4" exitCode=0 Apr 23 08:48:11.762920 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:11.762016 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerDied","Data":"acb88f32ecf34e160f26346a794566b1eae88f02f27f15078ff23efe63a12be4"} Apr 23 08:48:15.181034 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.180988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:15.181034 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.181033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.181093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181149 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181175 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181192 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181229 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181234 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:23.18121757 +0000 UTC m=+50.160754364 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181298 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:48:23.181280464 +0000 UTC m=+50.160817263 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:15.181562 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:15.181323 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:48:23.181312525 +0000 UTC m=+50.160849327 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:15.772937 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.772895 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" event={"ID":"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae","Type":"ContainerStarted","Data":"b3a79dd976808c5e2d456bf33adec10db40e245033a2d4625a5bb442dd15c96a"} Apr 23 08:48:15.773158 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.773147 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:15.775075 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.775033 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:48:15.777158 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.776879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" event={"ID":"0b4f3610-c641-4979-b0b6-27f27e89efe0","Type":"ContainerStarted","Data":"15be10a29795be9a1ad2960575240d55a1cc6ecf3722a766558b398c7d37d20b"} Apr 23 08:48:15.778408 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.778380 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" event={"ID":"87fd1d2e-4f49-403b-8134-81b4b406186c","Type":"ContainerStarted","Data":"da34fe3293c38241f5ebeb67ff404abbb20a812309b4a4af495e794b8d7491d7"} Apr 23 08:48:15.779656 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.779632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" event={"ID":"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5","Type":"ContainerStarted","Data":"38817ebc352938a6be3e0be5e746a05f2381abc4ed8ca106de184bfd5a4b21af"} Apr 23 08:48:15.789172 ip-10-0-141-232 
kubenswrapper[2579]: I0423 08:48:15.789122 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" podStartSLOduration=18.698797247 podStartE2EDuration="23.789107735s" podCreationTimestamp="2026-04-23 08:47:52 +0000 UTC" firstStartedPulling="2026-04-23 08:48:09.631521463 +0000 UTC m=+36.611058265" lastFinishedPulling="2026-04-23 08:48:14.721831955 +0000 UTC m=+41.701368753" observedRunningTime="2026-04-23 08:48:15.788881242 +0000 UTC m=+42.768418059" watchObservedRunningTime="2026-04-23 08:48:15.789107735 +0000 UTC m=+42.768644550" Apr 23 08:48:15.804274 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.804209 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" podStartSLOduration=18.733864464 podStartE2EDuration="23.804187352s" podCreationTimestamp="2026-04-23 08:47:52 +0000 UTC" firstStartedPulling="2026-04-23 08:48:09.631709029 +0000 UTC m=+36.611245823" lastFinishedPulling="2026-04-23 08:48:14.702031916 +0000 UTC m=+41.681568711" observedRunningTime="2026-04-23 08:48:15.803431944 +0000 UTC m=+42.782968761" watchObservedRunningTime="2026-04-23 08:48:15.804187352 +0000 UTC m=+42.783724166" Apr 23 08:48:15.824401 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:15.824342 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lr5f7" podStartSLOduration=9.606197213 podStartE2EDuration="42.824326629s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" firstStartedPulling="2026-04-23 08:47:36.437168122 +0000 UTC m=+3.416704916" lastFinishedPulling="2026-04-23 08:48:09.655297536 +0000 UTC m=+36.634834332" observedRunningTime="2026-04-23 08:48:15.822723559 +0000 UTC m=+42.802260377" watchObservedRunningTime="2026-04-23 08:48:15.824326629 +0000 UTC m=+42.803863444" Apr 23 08:48:17.785096 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:17.785036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" event={"ID":"87fd1d2e-4f49-403b-8134-81b4b406186c","Type":"ContainerStarted","Data":"cf4941c313eff32d54adc5aeb3a1f56c434bac2af22be7d76e301eaf485a09df"} Apr 23 08:48:17.785096 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:17.785099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" event={"ID":"87fd1d2e-4f49-403b-8134-81b4b406186c","Type":"ContainerStarted","Data":"1f552cf4b4cc931f8b017c6a2bd323220be347c33e4e74cc870115a2ec16d294"} Apr 23 08:48:17.806619 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:17.806577 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" podStartSLOduration=18.490457016 podStartE2EDuration="25.80656434s" podCreationTimestamp="2026-04-23 08:47:52 +0000 UTC" firstStartedPulling="2026-04-23 08:48:09.63152026 +0000 UTC m=+36.611057059" lastFinishedPulling="2026-04-23 08:48:16.947627577 +0000 UTC m=+43.927164383" observedRunningTime="2026-04-23 08:48:17.804992657 +0000 UTC m=+44.784529474" watchObservedRunningTime="2026-04-23 08:48:17.80656434 +0000 UTC m=+44.786101199" Apr 23 08:48:23.240101 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:23.240063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:23.240101 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:23.240101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:23.240132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240206 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240225 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240230 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240298 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.240284162 +0000 UTC m=+66.219820955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240234 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240322 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.24030587 +0000 UTC m=+66.219842672 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:23.240579 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:23.240372 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. 
No retries permitted until 2026-04-23 08:48:39.240362666 +0000 UTC m=+66.219899460 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:31.736535 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:31.736499 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ltg5" Apr 23 08:48:39.254282 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.254236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:48:39.254282 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.254275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.254313 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254380 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254400 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254408 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254434 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254456 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:11.254440332 +0000 UTC m=+98.233977126 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254473 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:49:11.254461285 +0000 UTC m=+98.233998081 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:48:39.254689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.254487 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:49:11.25448108 +0000 UTC m=+98.234017873 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:48:39.354960 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.354913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:39.357862 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.357845 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:48:39.367807 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.367785 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:48:39.379903 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.379872 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxjq5\" (UniqueName: \"kubernetes.io/projected/d41d976f-3b2d-457c-b9ac-f655f79bc0b2-kube-api-access-cxjq5\") pod \"network-check-target-7lkbc\" (UID: \"d41d976f-3b2d-457c-b9ac-f655f79bc0b2\") " pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:39.455983 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.455956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:39.456100 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.455996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:48:39.458759 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.458706 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:48:39.458877 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.458860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:48:39.467115 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.467097 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:48:39.467167 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:48:39.467160 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:49:43.467144159 +0000 UTC m=+130.446680958 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : secret "metrics-daemon-secret" not found Apr 23 08:48:39.469163 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.469138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7c7ca0ff-2599-42db-a76b-05f1bb8b8d60-original-pull-secret\") pod \"global-pull-secret-syncer-dcmzb\" (UID: \"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60\") " pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:39.528728 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.528670 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h75n2\"" Apr 23 08:48:39.530797 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.530782 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dcmzb" Apr 23 08:48:39.536582 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.536555 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:39.651209 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.651084 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dcmzb"] Apr 23 08:48:39.654139 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:48:39.654113 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7ca0ff_2599_42db_a76b_05f1bb8b8d60.slice/crio-521992a34062394747a661d893e0ea654c6b7f1953a9c7bdbf5cc7a24aba9227 WatchSource:0}: Error finding container 521992a34062394747a661d893e0ea654c6b7f1953a9c7bdbf5cc7a24aba9227: Status 404 returned error can't find the container with id 521992a34062394747a661d893e0ea654c6b7f1953a9c7bdbf5cc7a24aba9227 Apr 23 08:48:39.662745 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.662724 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7lkbc"] Apr 23 08:48:39.665673 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:48:39.665649 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41d976f_3b2d_457c_b9ac_f655f79bc0b2.slice/crio-a0c71bcd70ac14cefbd803285e760c2a011fbb69dcb99ed3a66d48be817ff0b8 WatchSource:0}: Error finding container a0c71bcd70ac14cefbd803285e760c2a011fbb69dcb99ed3a66d48be817ff0b8: Status 404 returned error can't find the container with id a0c71bcd70ac14cefbd803285e760c2a011fbb69dcb99ed3a66d48be817ff0b8 Apr 23 08:48:39.826501 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.826418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7lkbc" event={"ID":"d41d976f-3b2d-457c-b9ac-f655f79bc0b2","Type":"ContainerStarted","Data":"a0c71bcd70ac14cefbd803285e760c2a011fbb69dcb99ed3a66d48be817ff0b8"} Apr 23 08:48:39.827416 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:39.827393 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dcmzb" event={"ID":"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60","Type":"ContainerStarted","Data":"521992a34062394747a661d893e0ea654c6b7f1953a9c7bdbf5cc7a24aba9227"} Apr 23 08:48:44.839487 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:44.839443 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7lkbc" event={"ID":"d41d976f-3b2d-457c-b9ac-f655f79bc0b2","Type":"ContainerStarted","Data":"9c3fe010b907219ffeb85e903835fae9e53c8e5a30713c9708362a6c270804c2"} Apr 23 08:48:44.839920 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:44.839579 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:48:44.840757 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:44.840735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dcmzb" event={"ID":"7c7ca0ff-2599-42db-a76b-05f1bb8b8d60","Type":"ContainerStarted","Data":"75f3da5f3275a6234a32098a3260b3576a90ab88fcb3c518e44d5bd6c65bf174"} Apr 23 08:48:44.856807 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:44.856755 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7lkbc" podStartSLOduration=67.646620773 podStartE2EDuration="1m11.856739308s" podCreationTimestamp="2026-04-23 08:47:33 +0000 UTC" 
firstStartedPulling="2026-04-23 08:48:39.667320548 +0000 UTC m=+66.646857341" lastFinishedPulling="2026-04-23 08:48:43.877439081 +0000 UTC m=+70.856975876" observedRunningTime="2026-04-23 08:48:44.856484641 +0000 UTC m=+71.836021458" watchObservedRunningTime="2026-04-23 08:48:44.856739308 +0000 UTC m=+71.836276125" Apr 23 08:48:44.871467 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:48:44.871424 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dcmzb" podStartSLOduration=66.645675356 podStartE2EDuration="1m10.871413778s" podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="2026-04-23 08:48:39.655748094 +0000 UTC m=+66.635284889" lastFinishedPulling="2026-04-23 08:48:43.88148631 +0000 UTC m=+70.861023311" observedRunningTime="2026-04-23 08:48:44.870615082 +0000 UTC m=+71.850151897" watchObservedRunningTime="2026-04-23 08:48:44.871413778 +0000 UTC m=+71.850950594" Apr 23 08:49:11.304547 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:49:11.304505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:49:11.304547 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:49:11.304547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:49:11.304580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304657 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304669 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304675 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4458988b-jshml: secret "image-registry-tls" not found Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304669 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304733 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:50:15.304719098 +0000 UTC m=+162.284255892 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304747 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:50:15.304740977 +0000 UTC m=+162.284277770 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:49:11.305007 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:11.304764 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls podName:2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:15.304756158 +0000 UTC m=+162.284292953 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls") pod "image-registry-6f4458988b-jshml" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045") : secret "image-registry-tls" not found Apr 23 08:49:15.846464 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:49:15.846432 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7lkbc" Apr 23 08:49:43.530590 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:49:43.530534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:49:43.531094 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:43.530678 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:49:43.531094 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:49:43.530757 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs podName:4382bbb7-55ae-4eb9-a5be-4925147cf49d nodeName:}" failed. No retries permitted until 2026-04-23 08:51:45.530741228 +0000 UTC m=+252.510278022 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs") pod "network-metrics-daemon-msghc" (UID: "4382bbb7-55ae-4eb9-a5be-4925147cf49d") : secret "metrics-daemon-secret" not found Apr 23 08:50:10.341234 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:10.341185 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" Apr 23 08:50:10.431822 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:10.431775 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2vhbm" podUID="13dbe402-f1f7-465b-bb2a-cea6503b73fb" Apr 23 08:50:10.431822 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:10.431769 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8x8sn" podUID="68bcbcf4-4715-4a78-9508-3415c39698dc" Apr 23 08:50:10.503574 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:10.503549 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8jczv_4d506f42-bf26-438f-96ee-d33036fd434d/dns-node-resolver/0.log" Apr 23 08:50:11.040655 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:11.040623 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2vhbm" Apr 23 08:50:11.040837 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:11.040798 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:50:11.504546 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:11.504527 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kqtvw_b8c9848f-46ab-4976-a1d7-0f02cd542da7/node-ca/0.log" Apr 23 08:50:11.618099 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:11.618039 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-msghc" podUID="4382bbb7-55ae-4eb9-a5be-4925147cf49d" Apr 23 08:50:15.050694 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.050656 2579 generic.go:358] "Generic (PLEG): container finished" podID="492bafe0-e29c-40bf-ae28-09c4e7b2e6c5" containerID="38817ebc352938a6be3e0be5e746a05f2381abc4ed8ca106de184bfd5a4b21af" exitCode=255 Apr 23 08:50:15.051173 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.050731 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" event={"ID":"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5","Type":"ContainerDied","Data":"38817ebc352938a6be3e0be5e746a05f2381abc4ed8ca106de184bfd5a4b21af"} Apr 23 08:50:15.051173 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.051106 2579 scope.go:117] "RemoveContainer" containerID="38817ebc352938a6be3e0be5e746a05f2381abc4ed8ca106de184bfd5a4b21af" Apr 23 08:50:15.052011 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.051988 2579 generic.go:358] "Generic (PLEG): container finished" podID="c761ccf7-a9a1-4de6-9eb7-f86af4f698ae" containerID="b3a79dd976808c5e2d456bf33adec10db40e245033a2d4625a5bb442dd15c96a" exitCode=1 Apr 23 08:50:15.052096 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.052067 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" event={"ID":"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae","Type":"ContainerDied","Data":"b3a79dd976808c5e2d456bf33adec10db40e245033a2d4625a5bb442dd15c96a"} Apr 23 08:50:15.052337 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.052322 2579 scope.go:117] "RemoveContainer" containerID="b3a79dd976808c5e2d456bf33adec10db40e245033a2d4625a5bb442dd15c96a" Apr 23 08:50:15.365808 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.365771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:50:15.365808 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.365810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:50:15.366026 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.365843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " 
pod="openshift-dns/dns-default-2vhbm" Apr 23 08:50:15.366026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:15.365937 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:50:15.366026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:15.365979 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:50:15.366026 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:15.365992 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls podName:13dbe402-f1f7-465b-bb2a-cea6503b73fb nodeName:}" failed. No retries permitted until 2026-04-23 08:52:17.365976225 +0000 UTC m=+284.345513022 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls") pod "dns-default-2vhbm" (UID: "13dbe402-f1f7-465b-bb2a-cea6503b73fb") : secret "dns-default-metrics-tls" not found Apr 23 08:50:15.366173 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:50:15.366092 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert podName:68bcbcf4-4715-4a78-9508-3415c39698dc nodeName:}" failed. No retries permitted until 2026-04-23 08:52:17.366028045 +0000 UTC m=+284.345564842 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert") pod "ingress-canary-8x8sn" (UID: "68bcbcf4-4715-4a78-9508-3415c39698dc") : secret "canary-serving-cert" not found Apr 23 08:50:15.368218 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.368190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"image-registry-6f4458988b-jshml\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:50:15.544652 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.544620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpxkg\"" Apr 23 08:50:15.552333 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.552293 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:50:15.679701 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.679671 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f4458988b-jshml"] Apr 23 08:50:15.683542 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:50:15.683513 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbc4ff9_11e0_4100_bfc4_a7f9e73b2045.slice/crio-ed172f6e4949c5a7bd6f2449eaa2bfae3c87c3b415d20718d6e2f56ba4c518ae WatchSource:0}: Error finding container ed172f6e4949c5a7bd6f2449eaa2bfae3c87c3b415d20718d6e2f56ba4c518ae: Status 404 returned error can't find the container with id ed172f6e4949c5a7bd6f2449eaa2bfae3c87c3b415d20718d6e2f56ba4c518ae Apr 23 08:50:15.773546 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:15.773505 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:50:16.059222 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.059185 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dd44fcd45-t8bll" event={"ID":"492bafe0-e29c-40bf-ae28-09c4e7b2e6c5","Type":"ContainerStarted","Data":"6716f4ca11a52e79e024ec51958aeb69814b66b07b8bd39e3b9970023aa7adbc"} Apr 23 08:50:16.060955 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.060923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" event={"ID":"c761ccf7-a9a1-4de6-9eb7-f86af4f698ae","Type":"ContainerStarted","Data":"df92fc491220e7544ab4e24c46c57aa33114c3342a425bab55b90cee42292c7c"} Apr 23 08:50:16.061201 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.061138 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:50:16.061836 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.061811 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6db75f98cd-bcmc9" Apr 23 08:50:16.062324 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.062305 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4458988b-jshml" event={"ID":"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045","Type":"ContainerStarted","Data":"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8"} Apr 23 08:50:16.062428 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.062327 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4458988b-jshml" event={"ID":"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045","Type":"ContainerStarted","Data":"ed172f6e4949c5a7bd6f2449eaa2bfae3c87c3b415d20718d6e2f56ba4c518ae"} Apr 23 08:50:16.062495 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.062443 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:50:16.115344 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:16.115292 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podStartSLOduration=162.11527492 podStartE2EDuration="2m42.11527492s" 
podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:16.114494603 +0000 UTC m=+163.094031420" watchObservedRunningTime="2026-04-23 08:50:16.11527492 +0000 UTC m=+163.094811735" Apr 23 08:50:22.605688 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:22.605654 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:50:25.606066 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:25.606007 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:50:33.884572 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.884539 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wvl7x"] Apr 23 08:50:33.889237 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.889216 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:33.893435 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.893409 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:50:33.893551 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.893484 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:50:33.894839 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.894818 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wnmgb\"" Apr 23 08:50:33.894981 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.894846 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:50:33.894981 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.894866 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:50:33.902530 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:33.902508 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wvl7x"] Apr 23 08:50:34.003401 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.003377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dcb8455f-4bbc-4c90-b638-7404a9074f11-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.003535 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.003411 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dcb8455f-4bbc-4c90-b638-7404a9074f11-data-volume\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.003535 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.003462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/dcb8455f-4bbc-4c90-b638-7404a9074f11-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.003535 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.003510 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dcb8455f-4bbc-4c90-b638-7404a9074f11-crio-socket\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.003674 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.003547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbl86\" (UniqueName: \"kubernetes.io/projected/dcb8455f-4bbc-4c90-b638-7404a9074f11-kube-api-access-qbl86\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.104374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dcb8455f-4bbc-4c90-b638-7404a9074f11-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.104433 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dcb8455f-4bbc-4c90-b638-7404a9074f11-crio-socket\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.104485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbl86\" (UniqueName: \"kubernetes.io/projected/dcb8455f-4bbc-4c90-b638-7404a9074f11-kube-api-access-qbl86\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.104582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dcb8455f-4bbc-4c90-b638-7404a9074f11-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.104617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dcb8455f-4bbc-4c90-b638-7404a9074f11-data-volume\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.104963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/dcb8455f-4bbc-4c90-b638-7404a9074f11-data-volume\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.105470 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dcb8455f-4bbc-4c90-b638-7404a9074f11-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.106241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.105551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dcb8455f-4bbc-4c90-b638-7404a9074f11-crio-socket\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.108586 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.108562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dcb8455f-4bbc-4c90-b638-7404a9074f11-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.117476 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.117457 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbl86\" (UniqueName: \"kubernetes.io/projected/dcb8455f-4bbc-4c90-b638-7404a9074f11-kube-api-access-qbl86\") pod \"insights-runtime-extractor-wvl7x\" (UID: \"dcb8455f-4bbc-4c90-b638-7404a9074f11\") " pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.197350 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.197295 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wvl7x" Apr 23 08:50:34.313843 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:34.313818 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wvl7x"] Apr 23 08:50:34.316030 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:50:34.315999 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcb8455f_4bbc_4c90_b638_7404a9074f11.slice/crio-03fcbf2531d95275e51d0f591c64514feb0b75479b081abe955695100d800ca8 WatchSource:0}: Error finding container 03fcbf2531d95275e51d0f591c64514feb0b75479b081abe955695100d800ca8: Status 404 returned error can't find the container with id 03fcbf2531d95275e51d0f591c64514feb0b75479b081abe955695100d800ca8 Apr 23 08:50:35.106846 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:35.106820 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wvl7x" event={"ID":"dcb8455f-4bbc-4c90-b638-7404a9074f11","Type":"ContainerStarted","Data":"ffce2dd26b62d3ce92160679212354a7dc1987bc65383b162d7cbafbfdb7c171"} Apr 23 08:50:35.107142 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:35.106854 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wvl7x" event={"ID":"dcb8455f-4bbc-4c90-b638-7404a9074f11","Type":"ContainerStarted","Data":"03fcbf2531d95275e51d0f591c64514feb0b75479b081abe955695100d800ca8"} Apr 23 08:50:35.557142 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:35.557096 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:35.557330 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:35.557159 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:36.110599 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:36.110563 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wvl7x" event={"ID":"dcb8455f-4bbc-4c90-b638-7404a9074f11","Type":"ContainerStarted","Data":"c2b50498e684a4b475da59f0117563fda0f34f799bcf68eaca59933cbfc27658"} Apr 23 08:50:37.068771 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:37.068739 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:37.068922 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:37.068791 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:37.114950 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:37.114911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-wvl7x" event={"ID":"dcb8455f-4bbc-4c90-b638-7404a9074f11","Type":"ContainerStarted","Data":"b79b5b0dc42f3b627387030cd604a731d758df0fe494ea1edc86a0c7279e2297"} Apr 23 08:50:37.134261 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:37.134205 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wvl7x" podStartSLOduration=1.8720733950000001 podStartE2EDuration="4.134188473s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="2026-04-23 08:50:34.379304609 +0000 UTC m=+181.358841402" lastFinishedPulling="2026-04-23 08:50:36.641419679 +0000 UTC m=+183.620956480" observedRunningTime="2026-04-23 08:50:37.133962978 +0000 UTC m=+184.113499806" watchObservedRunningTime="2026-04-23 08:50:37.134188473 +0000 UTC m=+184.113725288" Apr 23 08:50:45.558416 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:45.558383 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:45.558803 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:45.558446 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:46.580836 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.580806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bw4l5"] Apr 23 08:50:46.583776 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.583759 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.587465 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.587445 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:50:46.587962 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.587945 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:50:46.588342 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.588329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:50:46.588904 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.588875 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:50:46.589009 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.588935 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bp5x8\"" Apr 23 08:50:46.589103 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.589015 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:50:46.589291 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.589275 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:50:46.701480 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.701438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-sys\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705530 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-root\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705705 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e48fcbd-4773-4940-92fb-261d5b4e7acf-metrics-client-ca\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705705 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-tls\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705705 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-accelerators-collector-config\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705833 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705833 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-textfile\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705907 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmzp\" (UniqueName: \"kubernetes.io/projected/5e48fcbd-4773-4940-92fb-261d5b4e7acf-kube-api-access-tsmzp\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.705907 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.705880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-wtmp\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806391 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-sys\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806391 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-root\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e48fcbd-4773-4940-92fb-261d5b4e7acf-metrics-client-ca\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-sys\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 
ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-tls\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806521 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-accelerators-collector-config\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-textfile\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806650 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-root\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmzp\" (UniqueName: \"kubernetes.io/projected/5e48fcbd-4773-4940-92fb-261d5b4e7acf-kube-api-access-tsmzp\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-wtmp\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-wtmp\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.806992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.806931 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-textfile\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.807228 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.807205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-accelerators-collector-config\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.807276 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.807231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e48fcbd-4773-4940-92fb-261d5b4e7acf-metrics-client-ca\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.808994 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.808972 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-tls\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.809112 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.809024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e48fcbd-4773-4940-92fb-261d5b4e7acf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.823709 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.823685 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmzp\" (UniqueName: \"kubernetes.io/projected/5e48fcbd-4773-4940-92fb-261d5b4e7acf-kube-api-access-tsmzp\") pod \"node-exporter-bw4l5\" (UID: \"5e48fcbd-4773-4940-92fb-261d5b4e7acf\") " pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.892543 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:46.892454 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bw4l5" Apr 23 08:50:46.900450 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:50:46.900422 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e48fcbd_4773_4940_92fb_261d5b4e7acf.slice/crio-593b66af4ec6875748b84c66651d9bc074d18b58ef81bc0284de2e8f26e8fe17 WatchSource:0}: Error finding container 593b66af4ec6875748b84c66651d9bc074d18b58ef81bc0284de2e8f26e8fe17: Status 404 returned error can't find the container with id 593b66af4ec6875748b84c66651d9bc074d18b58ef81bc0284de2e8f26e8fe17 Apr 23 08:50:47.069005 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:47.068968 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:47.069225 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:47.069019 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:47.140557 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:47.140524 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw4l5" event={"ID":"5e48fcbd-4773-4940-92fb-261d5b4e7acf","Type":"ContainerStarted","Data":"593b66af4ec6875748b84c66651d9bc074d18b58ef81bc0284de2e8f26e8fe17"} Apr 23 08:50:47.687440 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:47.687402 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" podUID="87fd1d2e-4f49-403b-8134-81b4b406186c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:50:48.143954 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:48.143918 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e48fcbd-4773-4940-92fb-261d5b4e7acf" containerID="697cdbe0ca3fc5313c13627d225d68c091191015c9979584938a7d05c5d709e2" exitCode=0 Apr 23 08:50:48.144154 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:48.143991 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw4l5" event={"ID":"5e48fcbd-4773-4940-92fb-261d5b4e7acf","Type":"ContainerDied","Data":"697cdbe0ca3fc5313c13627d225d68c091191015c9979584938a7d05c5d709e2"} Apr 23 08:50:49.148265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:49.148230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw4l5" event={"ID":"5e48fcbd-4773-4940-92fb-261d5b4e7acf","Type":"ContainerStarted","Data":"9d7050799580c77d12631524150ee36e3470cc84ca513aa074eef76687500051"} Apr 23 08:50:49.148265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:49.148266 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw4l5" event={"ID":"5e48fcbd-4773-4940-92fb-261d5b4e7acf","Type":"ContainerStarted","Data":"4a699bb9c511417a870fb54d2f9c5763098e3f901b9190de56fc6cdd878362c5"} Apr 23 08:50:49.169752 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:49.169700 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-bw4l5" podStartSLOduration=2.3679532229999998 podStartE2EDuration="3.16968411s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="2026-04-23 08:50:46.902386719 +0000 UTC m=+193.881923519" lastFinishedPulling="2026-04-23 08:50:47.704117602 +0000 UTC m=+194.683654406" observedRunningTime="2026-04-23 08:50:49.167893653 +0000 UTC m=+196.147430470" watchObservedRunningTime="2026-04-23 08:50:49.16968411 +0000 UTC m=+196.149220926" Apr 23 08:50:55.556613 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:55.556576 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:55.556989 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:55.556630 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:55.556989 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:55.556665 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:50:55.557124 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:55.557091 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8"} pod="openshift-image-registry/image-registry-6f4458988b-jshml" containerMessage="Container registry failed liveness probe, will be restarted" Apr 23 08:50:55.560474 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:55.560447 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:55.560630 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:55.560484 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:57.687123 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:50:57.687083 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" podUID="87fd1d2e-4f49-403b-8134-81b4b406186c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:51:05.560611 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:05.560570 2579 patch_prober.go:28] interesting pod/image-registry-6f4458988b-jshml container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:51:05.560985 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:05.560630 
2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:51:06.565533 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:06.565500 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f4458988b-jshml"] Apr 23 08:51:07.688012 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:07.687976 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" podUID="87fd1d2e-4f49-403b-8134-81b4b406186c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:51:07.688412 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:07.688060 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" Apr 23 08:51:07.688591 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:07.688573 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"cf4941c313eff32d54adc5aeb3a1f56c434bac2af22be7d76e301eaf485a09df"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 08:51:07.688639 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:07.688614 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" podUID="87fd1d2e-4f49-403b-8134-81b4b406186c" containerName="service-proxy" containerID="cri-o://cf4941c313eff32d54adc5aeb3a1f56c434bac2af22be7d76e301eaf485a09df" gracePeriod=30 Apr 23 08:51:08.198686 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:08.198652 2579 generic.go:358] "Generic (PLEG): container finished" podID="87fd1d2e-4f49-403b-8134-81b4b406186c" containerID="cf4941c313eff32d54adc5aeb3a1f56c434bac2af22be7d76e301eaf485a09df" exitCode=2 Apr 23 08:51:08.198834 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:08.198697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" event={"ID":"87fd1d2e-4f49-403b-8134-81b4b406186c","Type":"ContainerDied","Data":"cf4941c313eff32d54adc5aeb3a1f56c434bac2af22be7d76e301eaf485a09df"} Apr 23 08:51:08.198834 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:08.198719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d74b4fcd6-djg45" event={"ID":"87fd1d2e-4f49-403b-8134-81b4b406186c","Type":"ContainerStarted","Data":"785df2d40ff70ad4f8bcc1731be7bc5e748591b25c852b698ec35b9fa4c75510"} Apr 23 08:51:15.561156 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:15.561125 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:51:20.575020 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:20.574971 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" 
containerID="cri-o://7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8" gracePeriod=30 Apr 23 08:51:21.230670 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:21.230635 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerID="7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8" exitCode=0 Apr 23 08:51:21.230849 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:21.230680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4458988b-jshml" event={"ID":"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045","Type":"ContainerDied","Data":"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8"} Apr 23 08:51:21.230849 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:21.230705 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4458988b-jshml" event={"ID":"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045","Type":"ContainerStarted","Data":"e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52"} Apr 23 08:51:21.230849 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:21.230819 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:51:41.236098 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:41.236064 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:51:45.631004 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:45.630944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:51:45.633265 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:45.633244 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4382bbb7-55ae-4eb9-a5be-4925147cf49d-metrics-certs\") pod \"network-metrics-daemon-msghc\" (UID: \"4382bbb7-55ae-4eb9-a5be-4925147cf49d\") " pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:51:45.709549 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:45.709517 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zffjs\"" Apr 23 08:51:45.717399 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:45.717371 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-msghc" Apr 23 08:51:45.845778 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:45.845755 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-msghc"] Apr 23 08:51:45.848215 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:51:45.848185 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4382bbb7_55ae_4eb9_a5be_4925147cf49d.slice/crio-68a278469ee364632be8e8d2d82a42a1201fb593afdedcc3437521c1fd10321b WatchSource:0}: Error finding container 68a278469ee364632be8e8d2d82a42a1201fb593afdedcc3437521c1fd10321b: Status 404 returned error can't find the container with id 68a278469ee364632be8e8d2d82a42a1201fb593afdedcc3437521c1fd10321b Apr 23 08:51:46.248316 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.248278 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f4458988b-jshml" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" containerID="cri-o://e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52" gracePeriod=30 Apr 23 08:51:46.297284 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.297248 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-msghc" event={"ID":"4382bbb7-55ae-4eb9-a5be-4925147cf49d","Type":"ContainerStarted","Data":"68a278469ee364632be8e8d2d82a42a1201fb593afdedcc3437521c1fd10321b"} Apr 23 08:51:46.477131 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.477108 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:51:46.537802 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.537775 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.537979 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.537842 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-image-registry-private-configuration\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.537979 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.537955 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-bound-sa-token\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.538108 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538003 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-ca-trust-extracted\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.538108 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538071 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-installation-pull-secrets\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.538217 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538105 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-certificates\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.538217 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538138 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-trusted-ca\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.538217 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538171 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx548\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-kube-api-access-gx548\") pod \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\" (UID: \"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045\") " Apr 23 08:51:46.538684 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538660 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:46.538813 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.538744 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:46.540769 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.540747 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:51:46.540977 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.540954 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:51:46.541089 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.540957 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-kube-api-access-gx548" (OuterVolumeSpecName: "kube-api-access-gx548") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). 
InnerVolumeSpecName "kube-api-access-gx548". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:51:46.541165 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.541089 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:51:46.542740 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.542711 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:51:46.548352 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.548328 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" (UID: "2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:51:46.639784 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639753 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-image-registry-private-configuration\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639789 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-bound-sa-token\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639806 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-ca-trust-extracted\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639820 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-installation-pull-secrets\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639840 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-certificates\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639854 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-trusted-ca\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 
ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639867 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gx548\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-kube-api-access-gx548\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:46.640170 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:46.639882 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045-registry-tls\") on node \"ip-10-0-141-232.ec2.internal\" DevicePath \"\"" Apr 23 08:51:47.306030 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.305995 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerID="e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52" exitCode=0 Apr 23 08:51:47.306224 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.306084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4458988b-jshml" event={"ID":"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045","Type":"ContainerDied","Data":"e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52"} Apr 23 08:51:47.306224 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.306100 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f4458988b-jshml" Apr 23 08:51:47.306224 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.306128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4458988b-jshml" event={"ID":"2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045","Type":"ContainerDied","Data":"ed172f6e4949c5a7bd6f2449eaa2bfae3c87c3b415d20718d6e2f56ba4c518ae"} Apr 23 08:51:47.306224 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.306149 2579 scope.go:117] "RemoveContainer" containerID="e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52" Apr 23 08:51:47.307820 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.307798 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-msghc" event={"ID":"4382bbb7-55ae-4eb9-a5be-4925147cf49d","Type":"ContainerStarted","Data":"9c32ed2316b2176aa995ff0daa5f1e1b727ad375e84caaa27907e54fc85dfb31"} Apr 23 08:51:47.307820 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.307822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-msghc" event={"ID":"4382bbb7-55ae-4eb9-a5be-4925147cf49d","Type":"ContainerStarted","Data":"944f353a797b3401d0b6bdc272f671c5625633e60d2417fcdcdac2bbe80bae5f"} Apr 23 08:51:47.314390 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.314318 2579 scope.go:117] "RemoveContainer" containerID="7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8" Apr 23 08:51:47.322452 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.322437 2579 scope.go:117] "RemoveContainer" containerID="e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52" Apr 23 08:51:47.322689 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:51:47.322670 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52\": container with ID starting with e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52 not found: ID does not exist" 
containerID="e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52" Apr 23 08:51:47.322745 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.322696 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52"} err="failed to get container status \"e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52\": rpc error: code = NotFound desc = could not find container \"e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52\": container with ID starting with e4f0b83a0f9f00905d2a919a7af41e58b508ccfc7d2ad4154de73442400cfc52 not found: ID does not exist" Apr 23 08:51:47.322745 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.322713 2579 scope.go:117] "RemoveContainer" containerID="7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8" Apr 23 08:51:47.322917 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:51:47.322904 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8\": container with ID starting with 7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8 not found: ID does not exist" containerID="7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8" Apr 23 08:51:47.322955 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.322919 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8"} err="failed to get container status \"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8\": rpc error: code = NotFound desc = could not find container \"7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8\": container with ID starting with 7dd6e4e2d17d4a98e67c2d1c3a111b87491576c4000c7cc98dce26a70ae9f9d8 not found: ID does not exist" Apr 23 08:51:47.341178 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.341145 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-msghc" podStartSLOduration=252.246933942 podStartE2EDuration="4m13.341134812s" podCreationTimestamp="2026-04-23 08:47:34 +0000 UTC" firstStartedPulling="2026-04-23 08:51:45.849952302 +0000 UTC m=+252.829489102" lastFinishedPulling="2026-04-23 08:51:46.944153177 +0000 UTC m=+253.923689972" observedRunningTime="2026-04-23 08:51:47.339566987 +0000 UTC m=+254.319103814" watchObservedRunningTime="2026-04-23 08:51:47.341134812 +0000 UTC m=+254.320671652" Apr 23 08:51:47.377224 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.377205 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f4458988b-jshml"] Apr 23 08:51:47.382760 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.382742 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6f4458988b-jshml"] Apr 23 08:51:47.609699 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:51:47.609622 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" path="/var/lib/kubelet/pods/2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045/volumes" Apr 23 08:52:14.041101 ip-10-0-141-232 kubenswrapper[2579]: E0423 08:52:14.041060 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline 
exceeded" pod="openshift-dns/dns-default-2vhbm" podUID="13dbe402-f1f7-465b-bb2a-cea6503b73fb" Apr 23 08:52:14.377557 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:14.377474 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2vhbm" Apr 23 08:52:17.466546 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.466492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:52:17.467020 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.466590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:52:17.468860 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.468832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13dbe402-f1f7-465b-bb2a-cea6503b73fb-metrics-tls\") pod \"dns-default-2vhbm\" (UID: \"13dbe402-f1f7-465b-bb2a-cea6503b73fb\") " pod="openshift-dns/dns-default-2vhbm" Apr 23 08:52:17.468963 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.468887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68bcbcf4-4715-4a78-9508-3415c39698dc-cert\") pod \"ingress-canary-8x8sn\" (UID: \"68bcbcf4-4715-4a78-9508-3415c39698dc\") " pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:52:17.509828 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.509805 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-55hsm\"" Apr 23 08:52:17.517400 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.517380 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8x8sn" Apr 23 08:52:17.631794 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.631767 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8x8sn"] Apr 23 08:52:17.634704 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:52:17.634680 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68bcbcf4_4715_4a78_9508_3415c39698dc.slice/crio-74de2c7915e8715a367dbc8b4a609c19950eda0ee695585e6be0b164697cbba9 WatchSource:0}: Error finding container 74de2c7915e8715a367dbc8b4a609c19950eda0ee695585e6be0b164697cbba9: Status 404 returned error can't find the container with id 74de2c7915e8715a367dbc8b4a609c19950eda0ee695585e6be0b164697cbba9 Apr 23 08:52:17.681204 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.681172 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hskmq\"" Apr 23 08:52:17.688678 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.688657 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2vhbm" Apr 23 08:52:17.806754 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:17.806723 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2vhbm"] Apr 23 08:52:17.809424 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:52:17.809399 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13dbe402_f1f7_465b_bb2a_cea6503b73fb.slice/crio-696ce5589ce9e6c6086c5387d5bbaef143ebef4620a44f9c059569f8567a9871 WatchSource:0}: Error finding container 696ce5589ce9e6c6086c5387d5bbaef143ebef4620a44f9c059569f8567a9871: Status 404 returned error can't find the container with id 696ce5589ce9e6c6086c5387d5bbaef143ebef4620a44f9c059569f8567a9871 Apr 23 08:52:18.387862 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:18.387818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8x8sn" event={"ID":"68bcbcf4-4715-4a78-9508-3415c39698dc","Type":"ContainerStarted","Data":"74de2c7915e8715a367dbc8b4a609c19950eda0ee695585e6be0b164697cbba9"} Apr 23 08:52:18.388752 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:18.388730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2vhbm" event={"ID":"13dbe402-f1f7-465b-bb2a-cea6503b73fb","Type":"ContainerStarted","Data":"696ce5589ce9e6c6086c5387d5bbaef143ebef4620a44f9c059569f8567a9871"} Apr 23 08:52:20.395476 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:20.395444 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2vhbm" event={"ID":"13dbe402-f1f7-465b-bb2a-cea6503b73fb","Type":"ContainerStarted","Data":"ccdf3910a95c30fd45c614b6e3cf6b1b022108a7974a8c25624776e533128cb4"} Apr 23 08:52:20.395476 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:20.395476 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2vhbm" event={"ID":"13dbe402-f1f7-465b-bb2a-cea6503b73fb","Type":"ContainerStarted","Data":"76db31f478e891783fd362bce799ec794f1746df79624d0ee82c5920d6bc1ec3"} Apr 23 08:52:20.395941 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:20.395579 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2vhbm" Apr 23 08:52:20.414753 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:20.414715 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2vhbm" podStartSLOduration=251.577623973 podStartE2EDuration="4m13.41470234s" podCreationTimestamp="2026-04-23 08:48:07 +0000 UTC" firstStartedPulling="2026-04-23 08:52:17.811184757 +0000 UTC m=+284.790721551" lastFinishedPulling="2026-04-23 08:52:19.648263125 +0000 UTC m=+286.627799918" observedRunningTime="2026-04-23 08:52:20.413845104 +0000 UTC m=+287.393381953" watchObservedRunningTime="2026-04-23 08:52:20.41470234 +0000 UTC m=+287.394239155" Apr 23 08:52:21.398895 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:21.398857 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8x8sn" event={"ID":"68bcbcf4-4715-4a78-9508-3415c39698dc","Type":"ContainerStarted","Data":"b242ff36835b223ddd1b6c5a2a71009a4e9c10dc8185c51d2c7abc32637278ee"} Apr 23 08:52:30.401894 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:30.401860 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2vhbm" Apr 23 08:52:30.443160 ip-10-0-141-232 kubenswrapper[2579]: I0423 
08:52:30.443102 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8x8sn" podStartSLOduration=260.090025978 podStartE2EDuration="4m23.443088318s" podCreationTimestamp="2026-04-23 08:48:07 +0000 UTC" firstStartedPulling="2026-04-23 08:52:17.636847481 +0000 UTC m=+284.616384291" lastFinishedPulling="2026-04-23 08:52:20.989909833 +0000 UTC m=+287.969446631" observedRunningTime="2026-04-23 08:52:21.418828084 +0000 UTC m=+288.398364900" watchObservedRunningTime="2026-04-23 08:52:30.443088318 +0000 UTC m=+297.422625134" Apr 23 08:52:33.552724 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:52:33.552700 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 08:54:53.011992 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.011914 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-7664b"] Apr 23 08:54:53.012411 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.012196 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" Apr 23 08:54:53.012411 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.012207 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" Apr 23 08:54:53.012411 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.012256 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" Apr 23 08:54:53.012411 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.012267 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" Apr 23 08:54:53.014955 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.014938 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.017740 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.017711 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 08:54:53.017833 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.017742 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 08:54:53.018938 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.018923 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-lf27h\"" Apr 23 08:54:53.024025 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.024002 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-7664b"] Apr 23 08:54:53.107743 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.107699 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89f35cd-bafe-4d48-bbb0-603b9a32034b-bound-sa-token\") pod \"cert-manager-79c8d999ff-7664b\" (UID: \"f89f35cd-bafe-4d48-bbb0-603b9a32034b\") " pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.107743 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.107741 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265jz\" (UniqueName: \"kubernetes.io/projected/f89f35cd-bafe-4d48-bbb0-603b9a32034b-kube-api-access-265jz\") pod \"cert-manager-79c8d999ff-7664b\" (UID: \"f89f35cd-bafe-4d48-bbb0-603b9a32034b\") " pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.208237 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.208200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89f35cd-bafe-4d48-bbb0-603b9a32034b-bound-sa-token\") pod \"cert-manager-79c8d999ff-7664b\" (UID: \"f89f35cd-bafe-4d48-bbb0-603b9a32034b\") " pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.208237 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.208240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-265jz\" (UniqueName: \"kubernetes.io/projected/f89f35cd-bafe-4d48-bbb0-603b9a32034b-kube-api-access-265jz\") pod \"cert-manager-79c8d999ff-7664b\" (UID: \"f89f35cd-bafe-4d48-bbb0-603b9a32034b\") " pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.216976 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.216942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89f35cd-bafe-4d48-bbb0-603b9a32034b-bound-sa-token\") pod \"cert-manager-79c8d999ff-7664b\" (UID: \"f89f35cd-bafe-4d48-bbb0-603b9a32034b\") " pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.217124 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.217109 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-265jz\" (UniqueName: \"kubernetes.io/projected/f89f35cd-bafe-4d48-bbb0-603b9a32034b-kube-api-access-265jz\") pod \"cert-manager-79c8d999ff-7664b\" (UID: \"f89f35cd-bafe-4d48-bbb0-603b9a32034b\") " pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.324230 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.324131 2579 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-7664b" Apr 23 08:54:53.441481 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.441445 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-7664b"] Apr 23 08:54:53.444383 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:54:53.444354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89f35cd_bafe_4d48_bbb0_603b9a32034b.slice/crio-363169a0182e25d6b391edc5b045faa0dbb4d44948f957f81f32c460ae39b934 WatchSource:0}: Error finding container 363169a0182e25d6b391edc5b045faa0dbb4d44948f957f81f32c460ae39b934: Status 404 returned error can't find the container with id 363169a0182e25d6b391edc5b045faa0dbb4d44948f957f81f32c460ae39b934 Apr 23 08:54:53.446167 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.446153 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:54:53.777692 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:53.777661 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-7664b" event={"ID":"f89f35cd-bafe-4d48-bbb0-603b9a32034b","Type":"ContainerStarted","Data":"363169a0182e25d6b391edc5b045faa0dbb4d44948f957f81f32c460ae39b934"} Apr 23 08:54:56.788377 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:56.788337 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-7664b" event={"ID":"f89f35cd-bafe-4d48-bbb0-603b9a32034b","Type":"ContainerStarted","Data":"69d5615187315ff94c82fd1eefac3280cbb32fc39153a29f3f20f2229c541337"} Apr 23 08:54:56.804941 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:54:56.804890 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-7664b" podStartSLOduration=1.528698614 podStartE2EDuration="4.804873193s" podCreationTimestamp="2026-04-23 08:54:52 +0000 UTC" firstStartedPulling="2026-04-23 08:54:53.446276654 +0000 UTC m=+440.425813447" lastFinishedPulling="2026-04-23 08:54:56.722451225 +0000 UTC m=+443.701988026" observedRunningTime="2026-04-23 08:54:56.804261901 +0000 UTC m=+443.783798715" watchObservedRunningTime="2026-04-23 08:54:56.804873193 +0000 UTC m=+443.784410007" Apr 23 08:55:20.408651 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.408620 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl"] Apr 23 08:55:20.409137 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.408862 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" Apr 23 08:55:20.409137 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.408872 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbc4ff9-11e0-4100-bfc4-a7f9e73b2045" containerName="registry" Apr 23 08:55:20.414535 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.414516 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.418907 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.418880 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:55:20.419139 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.419117 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 23 08:55:20.419240 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.419189 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 23 08:55:20.419437 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.419419 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 23 08:55:20.419527 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.419491 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 23 08:55:20.419527 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.419468 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-8f2c5\"" Apr 23 08:55:20.420796 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.420774 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl"] Apr 23 08:55:20.503794 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.503757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60e4394c-05fa-410e-90d9-1eaa0f0a5294-metrics-certs\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.503980 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.503802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzz2\" (UniqueName: \"kubernetes.io/projected/60e4394c-05fa-410e-90d9-1eaa0f0a5294-kube-api-access-hxzz2\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.503980 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.503876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/60e4394c-05fa-410e-90d9-1eaa0f0a5294-manager-config\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.503980 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.503929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60e4394c-05fa-410e-90d9-1eaa0f0a5294-cert\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.605279 ip-10-0-141-232 
kubenswrapper[2579]: I0423 08:55:20.605238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzz2\" (UniqueName: \"kubernetes.io/projected/60e4394c-05fa-410e-90d9-1eaa0f0a5294-kube-api-access-hxzz2\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.605444 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.605288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/60e4394c-05fa-410e-90d9-1eaa0f0a5294-manager-config\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.605444 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.605312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60e4394c-05fa-410e-90d9-1eaa0f0a5294-cert\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.605444 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.605346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60e4394c-05fa-410e-90d9-1eaa0f0a5294-metrics-certs\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.606069 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.606022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/60e4394c-05fa-410e-90d9-1eaa0f0a5294-manager-config\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.607768 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.607752 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60e4394c-05fa-410e-90d9-1eaa0f0a5294-cert\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.607895 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.607878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/60e4394c-05fa-410e-90d9-1eaa0f0a5294-metrics-certs\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.614614 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.614584 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzz2\" (UniqueName: \"kubernetes.io/projected/60e4394c-05fa-410e-90d9-1eaa0f0a5294-kube-api-access-hxzz2\") pod \"jobset-controller-manager-58d6848c7d-r7bjl\" (UID: \"60e4394c-05fa-410e-90d9-1eaa0f0a5294\") " pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.724256 
ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.724163 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:20.851345 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:20.851314 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl"] Apr 23 08:55:20.854832 ip-10-0-141-232 kubenswrapper[2579]: W0423 08:55:20.854807 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e4394c_05fa_410e_90d9_1eaa0f0a5294.slice/crio-c462ae324f37aac73336f4b0eb98be02146b5124db67d93e5f38fb66b303564e WatchSource:0}: Error finding container c462ae324f37aac73336f4b0eb98be02146b5124db67d93e5f38fb66b303564e: Status 404 returned error can't find the container with id c462ae324f37aac73336f4b0eb98be02146b5124db67d93e5f38fb66b303564e Apr 23 08:55:21.856570 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:21.856532 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" event={"ID":"60e4394c-05fa-410e-90d9-1eaa0f0a5294","Type":"ContainerStarted","Data":"c462ae324f37aac73336f4b0eb98be02146b5124db67d93e5f38fb66b303564e"} Apr 23 08:55:22.860841 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:22.860808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" event={"ID":"60e4394c-05fa-410e-90d9-1eaa0f0a5294","Type":"ContainerStarted","Data":"0757794d15970d8d24a9338a1e998c7da42f55af611f605841f5487c1911d4d1"} Apr 23 08:55:22.861241 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:22.860921 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 08:55:22.881142 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:22.881089 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" podStartSLOduration=1.239402015 podStartE2EDuration="2.881073113s" podCreationTimestamp="2026-04-23 08:55:20 +0000 UTC" firstStartedPulling="2026-04-23 08:55:20.856498895 +0000 UTC m=+467.836035689" lastFinishedPulling="2026-04-23 08:55:22.498169982 +0000 UTC m=+469.477706787" observedRunningTime="2026-04-23 08:55:22.880576675 +0000 UTC m=+469.860113491" watchObservedRunningTime="2026-04-23 08:55:22.881073113 +0000 UTC m=+469.860609922" Apr 23 08:55:33.869708 ip-10-0-141-232 kubenswrapper[2579]: I0423 08:55:33.869674 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-58d6848c7d-r7bjl" Apr 23 09:27:57.093830 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:27:57.093745 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dcmzb_7c7ca0ff-2599-42db-a76b-05f1bb8b8d60/global-pull-secret-syncer/0.log" Apr 23 09:27:57.308236 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:27:57.308203 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xlcxj_ce69dc2d-3bc0-419d-b6d3-c42290984071/konnectivity-agent/0.log" Apr 23 09:27:57.396021 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:27:57.395935 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-232.ec2.internal_99512a33fd32ce0e80258f2ae365c428/haproxy/0.log" Apr 23 09:28:00.538260 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:00.538230 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bw4l5_5e48fcbd-4773-4940-92fb-261d5b4e7acf/node-exporter/0.log" Apr 23 09:28:00.562793 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:00.562766 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bw4l5_5e48fcbd-4773-4940-92fb-261d5b4e7acf/kube-rbac-proxy/0.log" Apr 23 09:28:00.586320 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:00.586294 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bw4l5_5e48fcbd-4773-4940-92fb-261d5b4e7acf/init-textfile/0.log" Apr 23 09:28:04.298367 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.298332 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql"] Apr 23 09:28:04.301688 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.301666 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.304318 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.304294 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-29qjf\"/\"default-dockercfg-nnl8j\"" Apr 23 09:28:04.304602 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.304583 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-29qjf\"/\"openshift-service-ca.crt\"" Apr 23 09:28:04.305537 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.305517 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-29qjf\"/\"kube-root-ca.crt\"" Apr 23 09:28:04.310111 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.310077 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql"] Apr 23 09:28:04.400874 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.400836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-sys\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.400874 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.400872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-proc\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.401099 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.400897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-lib-modules\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.401099 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.400969 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-podres\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.401099 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.400987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg59k\" (UniqueName: \"kubernetes.io/projected/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-kube-api-access-xg59k\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502377 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-podres\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502377 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg59k\" (UniqueName: \"kubernetes.io/projected/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-kube-api-access-xg59k\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502405 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-sys\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-proc\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-lib-modules\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-sys\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-proc\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-podres\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.502594 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.502554 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-lib-modules\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.511431 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.511400 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg59k\" (UniqueName: \"kubernetes.io/projected/9ef03fe8-eb9e-4939-af11-3f6a94ff1dca-kube-api-access-xg59k\") pod \"perf-node-gather-daemonset-klxql\" (UID: \"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.612015 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.611918 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.664659 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.664628 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2vhbm_13dbe402-f1f7-465b-bb2a-cea6503b73fb/dns/0.log" Apr 23 09:28:04.690673 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.690633 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2vhbm_13dbe402-f1f7-465b-bb2a-cea6503b73fb/kube-rbac-proxy/0.log" Apr 23 09:28:04.730093 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.730057 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql"] Apr 23 09:28:04.733403 ip-10-0-141-232 kubenswrapper[2579]: W0423 09:28:04.733376 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9ef03fe8_eb9e_4939_af11_3f6a94ff1dca.slice/crio-4cc4a682bfd446d5ae7c8200aa147e7b690d359dd79bc8cea0cea459dc03bd8d WatchSource:0}: Error finding container 4cc4a682bfd446d5ae7c8200aa147e7b690d359dd79bc8cea0cea459dc03bd8d: Status 404 returned error can't find the container with id 4cc4a682bfd446d5ae7c8200aa147e7b690d359dd79bc8cea0cea459dc03bd8d Apr 23 09:28:04.735145 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.735129 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:28:04.862949 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.862865 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8jczv_4d506f42-bf26-438f-96ee-d33036fd434d/dns-node-resolver/0.log" Apr 23 09:28:04.907740 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.907706 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" 
event={"ID":"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca","Type":"ContainerStarted","Data":"50671fbbd515b2db97257453233dbe3418dd0a67d1c3c6c6d6a62bf12d3a6d2e"} Apr 23 09:28:04.907740 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.907744 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" event={"ID":"9ef03fe8-eb9e-4939-af11-3f6a94ff1dca","Type":"ContainerStarted","Data":"4cc4a682bfd446d5ae7c8200aa147e7b690d359dd79bc8cea0cea459dc03bd8d"} Apr 23 09:28:04.908003 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.907829 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:04.924900 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:04.924842 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" podStartSLOduration=0.924821954 podStartE2EDuration="924.821954ms" podCreationTimestamp="2026-04-23 09:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:28:04.923998344 +0000 UTC m=+2431.903535157" watchObservedRunningTime="2026-04-23 09:28:04.924821954 +0000 UTC m=+2431.904358772" Apr 23 09:28:05.463572 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:05.463538 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kqtvw_b8c9848f-46ab-4976-a1d7-0f02cd542da7/node-ca/0.log" Apr 23 09:28:06.719424 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:06.719396 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8x8sn_68bcbcf4-4715-4a78-9508-3415c39698dc/serve-healthcheck-canary/0.log" Apr 23 09:28:07.309179 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:07.309154 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wvl7x_dcb8455f-4bbc-4c90-b638-7404a9074f11/kube-rbac-proxy/0.log" Apr 23 09:28:07.332903 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:07.332862 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wvl7x_dcb8455f-4bbc-4c90-b638-7404a9074f11/exporter/0.log" Apr 23 09:28:07.358762 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:07.358737 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wvl7x_dcb8455f-4bbc-4c90-b638-7404a9074f11/extractor/0.log" Apr 23 09:28:08.978791 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:08.978764 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-58d6848c7d-r7bjl_60e4394c-05fa-410e-90d9-1eaa0f0a5294/manager/0.log" Apr 23 09:28:10.920461 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:10.920431 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-klxql" Apr 23 09:28:14.357541 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.357510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/kube-multus-additional-cni-plugins/0.log" Apr 23 09:28:14.401828 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.401793 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/egress-router-binary-copy/0.log" Apr 23 09:28:14.439362 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.439327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/cni-plugins/0.log" Apr 23 09:28:14.479542 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.479510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/bond-cni-plugin/0.log" Apr 23 09:28:14.514976 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.514949 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/routeoverride-cni/0.log" Apr 23 09:28:14.551559 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.551531 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/whereabouts-cni-bincopy/0.log" Apr 23 09:28:14.588579 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.588497 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr5f7_0b4f3610-c641-4979-b0b6-27f27e89efe0/whereabouts-cni/0.log" Apr 23 09:28:14.705226 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.705191 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lh64p_53bacb41-fc36-4333-8726-ced662fc325c/kube-multus/0.log" Apr 23 09:28:14.817625 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.817594 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-msghc_4382bbb7-55ae-4eb9-a5be-4925147cf49d/network-metrics-daemon/0.log" Apr 23 09:28:14.860830 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:14.860744 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-msghc_4382bbb7-55ae-4eb9-a5be-4925147cf49d/kube-rbac-proxy/0.log" Apr 23 09:28:16.093688 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.093651 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/ovn-controller/0.log" Apr 23 09:28:16.141155 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.141125 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/ovn-acl-logging/0.log" Apr 23 09:28:16.170625 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.170593 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/kube-rbac-proxy-node/0.log" Apr 23 09:28:16.208778 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.208741 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:28:16.231063 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.231027 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/northd/0.log" Apr 23 09:28:16.260525 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.260489 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/nbdb/0.log" Apr 23 09:28:16.289497 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.289460 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/sbdb/0.log" Apr 23 09:28:16.419166 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:16.419080 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ltg5_ee4b8a45-674b-4a0c-ae63-3fe36bbbf85a/ovnkube-controller/0.log" Apr 23 09:28:18.114804 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:18.114771 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7lkbc_d41d976f-3b2d-457c-b9ac-f655f79bc0b2/network-check-target-container/0.log" Apr 23 09:28:19.286886 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:19.286854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-z6vqc_18df44e5-6990-4d9e-89c1-25b5e449602b/iptables-alerter/0.log" Apr 23 09:28:20.138527 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:20.138501 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8x5bw_321aa24e-593f-4b13-bd8d-31c635ce80f5/tuned/0.log" Apr 23 09:28:23.986697 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:23.986669 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dtfbv_d7eddd42-5408-4933-957d-9e3947f66e44/csi-driver/0.log" Apr 23 09:28:24.012715 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:24.012691 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dtfbv_d7eddd42-5408-4933-957d-9e3947f66e44/csi-node-driver-registrar/0.log" Apr 23 09:28:24.038691 ip-10-0-141-232 kubenswrapper[2579]: I0423 09:28:24.038669 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dtfbv_d7eddd42-5408-4933-957d-9e3947f66e44/csi-liveness-probe/0.log"