Apr 20 09:58:56.914944 ip-10-0-137-106 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 09:58:56.914959 ip-10-0-137-106 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 09:58:56.914969 ip-10-0-137-106 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 09:58:56.915309 ip-10-0-137-106 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 09:59:07.094288 ip-10-0-137-106 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 09:59:07.094299 ip-10-0-137-106 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 78dccf1e353d473596030ab68729f3da --
Apr 20 10:01:12.220688 ip-10-0-137-106 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.657848 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660206 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660218 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660222 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660228 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660232 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660235 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660238 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:12.712863 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660241 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660244 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660247 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660250 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660253 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660258 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660260 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660263 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660266 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660268 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660278 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660281 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660284 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660287 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660290 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660293 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660296 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660299 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660302 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 
10:01:12.660304 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 10:01:12.713876 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660307 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660310 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660313 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660315 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660318 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660321 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660324 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660326 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660329 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660331 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660334 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660337 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660340 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660360 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660364 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660366 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660369 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660372 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660376 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660386 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:12.714868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660390 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660393 2566 feature_gate.go:328] unrecognized feature 
gate: MixedCPUsAllocation Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660395 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660398 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660401 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660403 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660406 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660415 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660419 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660422 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660425 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660428 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660431 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660434 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660436 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660439 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660441 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660444 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660447 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660450 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:12.715533 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660456 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660459 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660462 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660464 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 
10:01:12.660468 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660471 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660473 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660476 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660479 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660481 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660484 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660486 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660491 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660495 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660498 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660501 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660504 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660506 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660509 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:12.716440 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660903 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660907 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660910 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660913 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660916 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660918 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660921 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660924 2566 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660926 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660929 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660932 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660935 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660939 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660942 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660944 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660947 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660950 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660952 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660955 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660958 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 10:01:12.716985 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660960 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660963 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660965 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660969 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660971 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660975 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660979 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660981 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660984 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660987 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660990 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660993 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660995 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.660998 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661000 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661003 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661006 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661008 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661011 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:12.717582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661014 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661017 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661019 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661021 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661024 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661026 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661030 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661033 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661035 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 
10:01:12.661038 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661040 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661043 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661045 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661048 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661050 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661053 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661055 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661058 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661061 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661063 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:12.718379 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661065 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661068 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661071 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661074 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661076 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661079 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661081 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661084 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661087 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661089 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661092 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661095 2566 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661099 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661103 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661106 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661108 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661111 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661113 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661116 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:12.718955 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661120 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661122 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661125 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661128 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661130 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661133 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661135 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.661138 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662293 2566 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662305 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662312 2566 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662317 2566 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662323 2566 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662327 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662332 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662336 2566 
flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662340 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662356 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662363 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662369 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662374 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662379 2566 flags.go:64] FLAG: --cgroup-root="" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662383 2566 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 10:01:12.719580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662386 2566 flags.go:64] FLAG: --client-ca-file="" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662389 2566 flags.go:64] FLAG: --cloud-config="" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662392 2566 flags.go:64] FLAG: --cloud-provider="external" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662395 2566 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662399 2566 flags.go:64] FLAG: --cluster-domain="" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662402 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662406 2566 flags.go:64] FLAG: --config-dir="" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662408 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662412 2566 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662421 2566 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662427 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662431 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662434 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662439 2566 flags.go:64] FLAG: --contention-profiling="false" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662442 2566 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662445 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662448 2566 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662451 2566 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 
10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662456 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662459 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662462 2566 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662465 2566 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662468 2566 flags.go:64] FLAG: --enable-server="true" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662472 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662477 2566 flags.go:64] FLAG: --event-burst="100" Apr 20 10:01:12.720318 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662480 2566 flags.go:64] FLAG: --event-qps="50" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662483 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662486 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662489 2566 flags.go:64] FLAG: --eviction-hard="" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662493 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662497 2566 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662500 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662503 2566 flags.go:64] FLAG: --eviction-soft="" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662506 2566 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662509 2566 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662512 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662515 2566 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662518 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662521 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662524 2566 flags.go:64] FLAG: --feature-gates="" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662528 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662532 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662535 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662539 2566 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662542 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662547 2566 flags.go:64] FLAG: --help="false" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662550 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-137-106.ec2.internal" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662553 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662556 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 10:01:12.721052 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662559 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662563 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662566 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662569 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662572 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662575 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662578 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662581 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662584 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662587 2566 flags.go:64] FLAG: --kube-reserved="" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662590 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662593 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662596 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662599 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662602 2566 flags.go:64] FLAG: --lock-file="" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662605 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662608 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662611 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662616 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662620 2566 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662623 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662625 2566 flags.go:64] FLAG: --logging-format="text" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662628 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662631 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 10:01:12.828186 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662635 2566 flags.go:64] FLAG: --manifest-url="" Apr 20 10:01:12.767406 ip-10-0-137-106 systemd[1]: Started Kubernetes Kubelet. Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662638 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662643 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662646 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662652 2566 flags.go:64] FLAG: --max-pods="110" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662655 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662659 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662662 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662665 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662668 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662671 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662674 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662682 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662685 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662688 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662691 2566 flags.go:64] FLAG: --pod-cidr="" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662694 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662701 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662704 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662707 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 20 10:01:12.829109 
ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662710 2566 flags.go:64] FLAG: --port="10250" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662713 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662716 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f50c6830dfca3fd7" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662719 2566 flags.go:64] FLAG: --qos-reserved="" Apr 20 10:01:12.829109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662722 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662725 2566 flags.go:64] FLAG: --register-node="true" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662728 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662731 2566 flags.go:64] FLAG: --register-with-taints="" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662735 2566 flags.go:64] FLAG: --registry-burst="10" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662738 2566 flags.go:64] FLAG: --registry-qps="5" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662741 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662744 2566 flags.go:64] FLAG: --reserved-memory="" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662748 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662752 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662755 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662758 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662763 2566 flags.go:64] FLAG: --runonce="false" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662766 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662770 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662773 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662776 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662779 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662782 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662785 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662792 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662796 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 10:01:12.830272 ip-10-0-137-106 
kubenswrapper[2566]: I0420 10:01:12.662798 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662802 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662805 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662808 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 10:01:12.830272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662811 2566 flags.go:64] FLAG: --system-cgroups="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662814 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662820 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662823 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662826 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662829 2566 flags.go:64] FLAG: --tls-min-version="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662832 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662835 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662838 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662841 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662844 2566 flags.go:64] FLAG: --v="2" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662849 2566 flags.go:64] FLAG: --version="false" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662853 2566 flags.go:64] FLAG: --vmodule="" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662857 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.662860 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662966 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662969 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662973 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662977 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662981 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662983 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 10:01:12.831069 ip-10-0-137-106 
kubenswrapper[2566]: W0420 10:01:12.662986 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662988 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:12.831069 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662992 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.662996 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663000 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663003 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663005 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663009 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663011 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663014 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663016 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663019 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663021 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663024 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663027 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663029 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663032 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663035 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663038 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663040 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663042 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:13.065390 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663045 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 
10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663048 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663051 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663053 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663056 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663060 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663063 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663066 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663069 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663072 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663075 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663078 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663080 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663083 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663085 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663088 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663090 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663093 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663095 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663098 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:13.066154 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663101 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663104 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663106 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663109 2566 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663112 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663114 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663116 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663119 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663121 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663124 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663126 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663129 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663131 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663134 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663136 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663139 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663142 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663145 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663148 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663150 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 10:01:13.066797 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663155 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663158 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663160 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663163 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663166 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663168 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 
10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663171 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663173 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663176 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663179 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663181 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663183 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663186 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663189 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663191 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663195 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663199 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663201 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 10:01:13.067542 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.663204 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.663835 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.672719 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.672739 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672791 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672796 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672800 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672803 2566 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672806 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672809 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672812 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672814 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672817 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672819 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672822 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 10:01:13.068132 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672825 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672828 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672830 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672833 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672836 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672838 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672841 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672844 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672846 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672849 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672851 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672854 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672857 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672860 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672862 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 
10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672865 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672867 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672870 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672872 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672875 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:13.068720 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672879 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672882 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672885 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672888 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672890 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672893 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672895 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672898 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672900 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672904 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672908 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672911 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672913 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672916 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672919 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672921 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672924 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672928 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672930 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672933 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:13.069368 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672936 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672939 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672942 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672945 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672947 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672950 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672952 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672954 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672957 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672960 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672962 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672964 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672967 2566 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672970 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672973 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672976 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672979 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672984 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672987 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672991 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 10:01:13.070006 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672994 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.672997 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673000 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673002 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673005 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673008 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673011 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673013 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673016 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673021 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673024 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673026 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673029 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673031 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673034 2566 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.673040 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 10:01:13.070837 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673184 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673191 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673194 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673197 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673200 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673203 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673206 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673209 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673212 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673217 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673221 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673225 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
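The long runs of "unrecognized feature gate" warnings above and below come from cluster-scoped OpenShift gate names (GatewayAPI, RouteAdvertisements, ManagedBootImages, and so on) being handed to a kubelet whose own registry only knows the upstream Kubernetes gates; the feature_gate.go:384 record shows the map that is actually applied. As a rough illustration of that behaviour, and not the kubelet's actual wiring, the Go sketch below registers two gates taken from the logged map with k8s.io/component-base/featuregate and then feeds it one OpenShift-only name. The FeatureSpec values are invented for the example; upstream SetFromMap rejects unknown names with an error, whereas the wrapper in this log apparently downgrades the same condition to a warning.

package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	fg := featuregate.NewFeatureGate()

	// Register only the gates this binary knows about (specs are illustrative).
	if err := fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"NodeSwap":    {Default: false, PreRelease: featuregate.Beta},
		"ImageVolume": {Default: true, PreRelease: featuregate.Beta},
	}); err != nil {
		panic(err)
	}

	// Cluster-level configuration may carry extra names, e.g. OpenShift-only gates.
	err := fg.SetFromMap(map[string]bool{
		"NodeSwap":    false,
		"ImageVolume": true,
		"GatewayAPI":  true, // unknown to this registry
	})
	fmt.Println(err) // upstream behaviour: "unrecognized feature gate: GatewayAPI"

	// Known gates still resolve, matching the feature_gate.go:384 summary line.
	fmt.Println("ImageVolume enabled:", fg.Enabled("ImageVolume"))
}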
Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673229 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673232 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673235 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673238 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673241 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673243 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673246 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:13.071336 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673249 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673251 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673254 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673256 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673259 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673261 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673266 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673268 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673271 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673274 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673277 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673279 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673282 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673284 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673287 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 10:01:13.071890 ip-10-0-137-106 
kubenswrapper[2566]: W0420 10:01:12.673289 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673292 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673294 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673297 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673300 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:13.071890 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673302 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673305 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673308 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673311 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673314 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673317 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673319 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673322 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673325 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673327 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673331 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673334 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673337 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673339 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673342 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673361 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673364 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673366 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673370 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 10:01:13.072412 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673373 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673375 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673378 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673380 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673383 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673385 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673388 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673390 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673393 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673395 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673398 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673401 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673403 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673406 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 
10:01:12.673408 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673412 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673414 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673417 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673427 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673430 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:13.072898 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673432 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673435 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673437 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673440 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673443 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673445 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673448 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:12.673450 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.673455 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.674108 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.676245 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.677255 2566 server.go:1019] "Starting client certificate rotation" Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.677364 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.677403 2566 
certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 10:01:13.073414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.701653 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.708253 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.723169 2566 log.go:25] "Validated CRI v1 runtime API" Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.728949 2566 log.go:25] "Validated CRI v1 image API" Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.730904 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.734959 2566 fs.go:135] Filesystem UUIDs: map[35a4eacc-696c-4bb7-9543-0ae67d8ef9db:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7f05b8fb-937a-4e61-8b80-b282bfffaa1d:/dev/nvme0n1p3] Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.734976 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 10:01:13.073819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.736213 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.741110 2566 manager.go:217] Machine: {Timestamp:2026-04-20 10:01:12.7391784 +0000 UTC m=+0.398841071 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3090113 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27c1384ded801d670b93645b2120a9 SystemUUID:ec27c138-4ded-801d-670b-93645b2120a9 BootID:78dccf1e-353d-4735-9603-0ab68729f3da Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5b:2e:e1:1f:fd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5b:2e:e1:1f:fd Speed:0 Mtu:9001} {Name:ovs-system 
MacAddress:52:ea:c1:d4:f0:c4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.741899 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.742001 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.743210 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.743241 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-106.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 
10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.743413 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.743422 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.743440 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.744058 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.745270 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.745408 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.747527 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.747545 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.747558 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.747569 2566 kubelet.go:397] "Adding apiserver pod source" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.747580 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.748848 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 10:01:13.074008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.748868 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.751968 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.755833 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757179 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757202 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757213 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757230 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757239 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757249 2566 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757259 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757267 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757279 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.757289 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.760600 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.761215 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.762260 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.762278 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.763365 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.763394 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.766467 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.766524 2566 server.go:1295] "Started kubelet" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.766640 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.766632 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.766701 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.768200 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.768709 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.773592 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.773674 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "ip-10-0-137-106.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.774147 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.773660 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-106.ec2.internal.18a80862a50b9238 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-106.ec2.internal,UID:ip-10-0-137-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-106.ec2.internal,},FirstTimestamp:2026-04-20 10:01:12.766476856 +0000 UTC m=+0.426139527,LastTimestamp:2026-04-20 10:01:12.766476856 +0000 UTC m=+0.426139527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-106.ec2.internal,}" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.774835 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.774851 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.774950 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.775043 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.775053 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.775165 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.775960 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.775975 2566 factory.go:55] Registering systemd factory Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.775984 2566 factory.go:223] Registration of the systemd container factory successfully Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.776949 2566 factory.go:153] Registering CRI-O factory Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.776965 2566 factory.go:223] Registration of the crio container factory successfully Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.776989 2566 factory.go:103] Registering Raw factory Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.777003 2566 manager.go:1196] Started watching for new ooms in manager Apr 20 10:01:13.074517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.777567 2566 
manager.go:319] Starting recovery of all containers Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.778258 2566 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.781046 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.781124 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.785380 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-thwg8" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.788533 2566 manager.go:324] Recovery completed Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.794877 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-thwg8" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.795218 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.797845 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.797871 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.797881 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.798290 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.798297 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.798312 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.800315 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-106.ec2.internal.18a80862a6ea73e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-106.ec2.internal,UID:ip-10-0-137-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node 
ip-10-0-137-106.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-106.ec2.internal,},FirstTimestamp:2026-04-20 10:01:12.797860833 +0000 UTC m=+0.457523504,LastTimestamp:2026-04-20 10:01:12.797860833 +0000 UTC m=+0.457523504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-106.ec2.internal,}" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.800572 2566 policy_none.go:49] "None policy: Start" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.800590 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.800603 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842039 2566 manager.go:341] "Starting Device Plugin manager" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.842088 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842100 2566 server.go:85] "Starting device plugin registration server" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842382 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842394 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842637 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842697 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.842704 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.843233 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.843264 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.853230 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.854469 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.854493 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.854509 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.854516 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.854551 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.857627 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.943135 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.944314 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.944380 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:13.075684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.944392 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.944442 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.953994 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.954017 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-106.ec2.internal\": node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.954887 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal"] Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.954954 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.956244 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.956271 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.956290 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.959840 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960002 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960034 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960663 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960685 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960694 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960705 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960706 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.960716 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.962095 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.962133 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.963039 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.963063 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.963073 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.976242 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.976280 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:12.976314 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/daadd8592f265af4bb30938041dae753-config\") pod \"kube-apiserver-proxy-ip-10-0-137-106.ec2.internal\" (UID: \"daadd8592f265af4bb30938041dae753\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.988214 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-106.ec2.internal\" not found" node="ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.988909 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.076694 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:12.996604 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-106.ec2.internal\" not found" node="ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.077499 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.077298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.077499 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.077326 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.077499 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.077365 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/daadd8592f265af4bb30938041dae753-config\") pod \"kube-apiserver-proxy-ip-10-0-137-106.ec2.internal\" (UID: \"daadd8592f265af4bb30938041dae753\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.077499 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.077381 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.077499 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.077396 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/daadd8592f265af4bb30938041dae753-config\") pod \"kube-apiserver-proxy-ip-10-0-137-106.ec2.internal\" (UID: \"daadd8592f265af4bb30938041dae753\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.077499 
ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.077403 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.089385 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.089342 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.190028 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.189999 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.289778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.289754 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.290835 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.290817 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.298394 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.298375 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.391061 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.391027 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.491596 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.491555 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 20 10:01:13.585486 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.585420 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:13.675793 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.675757 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.676852 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.676827 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 10:01:13.676995 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.676934 2566 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a1d67bc70229748519b19bda412858fd-4c86189ea09433c0.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": write tcp 10.0.137.106:56494->100.29.209.28:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.676995 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.676963 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 20 10:01:13.676995 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.676968 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="an error on the server (\"unable to decode an 
event from the watch stream: http2: client connection force closed via ClientConn.Close\") has prevented the request from succeeding" Apr 20 10:01:13.677109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.676989 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="an error on the server (\"unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close\") has prevented the request from succeeding" Apr 20 10:01:13.696253 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.696221 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 10:01:13.747951 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.747916 2566 apiserver.go:52] "Watching apiserver" Apr 20 10:01:13.755239 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.755215 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 10:01:13.755671 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.755649 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-nhf2t","openshift-dns/node-resolver-tjdkv","openshift-image-registry/node-ca-jtlqx","openshift-multus/multus-additional-cni-plugins-9z29g","openshift-multus/network-metrics-daemon-dhkq5","openshift-network-diagnostics/network-check-target-kd7l2","openshift-network-operator/iptables-alerter-wqpqq","openshift-ovn-kubernetes/ovnkube-node-69b6q","kube-system/konnectivity-agent-6bwlp","kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd","openshift-multus/multus-smhx6"] Apr 20 10:01:13.758961 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.758939 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.760931 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.760908 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.761034 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.760945 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 10:01:13.761034 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.760969 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mbqcm\"" Apr 20 10:01:13.761142 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.761129 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.763010 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.762990 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.763112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.763009 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.764822 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.764805 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.765092 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.764949 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.765092 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.764991 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sf22m\"" Apr 20 10:01:13.765092 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.765007 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cnnxf\"" Apr 20 10:01:13.765092 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.764990 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 10:01:13.765092 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.765061 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 10:01:13.765369 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.765238 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.767012 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.766993 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.767137 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.767120 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mspqh\"" Apr 20 10:01:13.767137 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.767134 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.767447 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.767427 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.769029 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.769007 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.770074 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.769449 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 10:01:13.770074 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.769653 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.770074 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.769893 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 10:01:13.770074 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.769976 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 10:01:13.770272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.770190 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zvgjl\"" Apr 20 10:01:13.771881 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.771860 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:13.771957 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.771922 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:13.773824 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.773807 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 10:01:13.773917 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.773848 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:13.773980 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.773911 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:13.773980 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.773961 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.775703 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.775685 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.775873 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.775854 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n6kl2\"" Apr 20 10:01:13.775952 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.775921 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.776010 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.775974 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 10:01:13.776272 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.776254 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.778163 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778144 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-w9fd6\"" Apr 20 10:01:13.778311 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778297 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 10:01:13.778422 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778401 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.778525 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778405 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 10:01:13.778588 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778532 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 10:01:13.778658 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778640 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 10:01:13.778728 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778714 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.778786 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.778755 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.781056 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781038 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.781133 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781072 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.781133 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781094 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnvz\" (UniqueName: \"kubernetes.io/projected/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-kube-api-access-dmnvz\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:13.781231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781123 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cnibin\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.781231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-kubernetes\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.781231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781175 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-node-log\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781199 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-cni-bin\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781222 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-ovnkube-config\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781245 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-env-overrides\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: 
I0420 10:01:13.781268 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn962\" (UniqueName: \"kubernetes.io/projected/1bd60891-3cb2-4033-81b3-819a1fd45edd-kube-api-access-xn962\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781282 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cni-binary-copy\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781324 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-run-netns\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-log-socket\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781418 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-host-slash\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-run\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781455 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-lib-modules\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.781483 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781468 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-ovn\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.781861 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781540 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lth4\" (UniqueName: 
\"kubernetes.io/projected/cd566bd9-42a8-48e9-889b-d01eee4488c2-kube-api-access-4lth4\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.781861 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781779 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-system-cni-dir\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.781861 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781821 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-os-release\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.781861 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781852 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-sys\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.782036 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781885 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zf2p\" (UniqueName: \"kubernetes.io/projected/580807b1-acaf-4082-b5c8-ab84f495b516-kube-api-access-7zf2p\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.782036 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.782036 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781941 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-var-lib-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.782036 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.781988 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysconfig\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.782036 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782023 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b5da0aba-0c60-4b76-b8af-041b04e6fc2b-agent-certs\") pod 
\"konnectivity-agent-6bwlp\" (UID: \"b5da0aba-0c60-4b76-b8af-041b04e6fc2b\") " pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.782254 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782055 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-modprobe-d\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.782386 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782366 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysctl-conf\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.782753 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bd60891-3cb2-4033-81b3-819a1fd45edd-hosts-file\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.782753 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782557 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysctl-d\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.782753 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782560 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 10:01:13.782753 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782591 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-slash\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.782753 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782682 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-n4zld\"" Apr 20 10:01:13.783038 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782887 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-iptables-alerter-script\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.783038 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.782934 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcjlf\" (UniqueName: \"kubernetes.io/projected/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-kube-api-access-vcjlf\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.783127 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783056 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-kubelet\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.783127 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783074 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 10:01:13.783127 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783086 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 10:01:13.783257 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783108 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.783309 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783254 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 10:01:13.783309 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783281 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-tuned\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.783492 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k75jj\" (UniqueName: \"kubernetes.io/projected/ed2f027e-e531-41a7-8185-1a14d9f86cb2-kube-api-access-k75jj\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.783492 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783384 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-host\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.783492 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783435 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cspcd\"" Apr 20 10:01:13.783492 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783435 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-systemd-units\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.783681 ip-10-0-137-106 kubenswrapper[2566]: I0420 
10:01:13.783492 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd566bd9-42a8-48e9-889b-d01eee4488c2-host\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.783681 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783521 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cd566bd9-42a8-48e9-889b-d01eee4488c2-serviceca\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.783681 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783571 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b5da0aba-0c60-4b76-b8af-041b04e6fc2b-konnectivity-ca\") pod \"konnectivity-agent-6bwlp\" (UID: \"b5da0aba-0c60-4b76-b8af-041b04e6fc2b\") " pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.783681 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783622 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-ovnkube-script-lib\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.783681 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783658 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-etc-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.783899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783717 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bd60891-3cb2-4033-81b3-819a1fd45edd-tmp-dir\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.783899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.783899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:13.783899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783805 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgh4\" (UniqueName: 
\"kubernetes.io/projected/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-kube-api-access-qqgh4\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.783899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-systemd\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.783899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783885 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-var-lib-kubelet\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.784114 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783923 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-tmp\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.784114 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783954 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-cni-netd\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.784114 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.783983 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-systemd\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.784114 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.784027 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-run-ovn-kubernetes\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.784114 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.784069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.784362 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.784144 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod 
\"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:13.784362 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.784187 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/580807b1-acaf-4082-b5c8-ab84f495b516-ovn-node-metrics-cert\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.786414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.786393 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 10:01:13.796142 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.796110 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 09:56:12 +0000 UTC" deadline="2027-11-15 21:18:13.789413632 +0000 UTC" Apr 20 10:01:13.796142 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.796137 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13787h16m59.993280059s" Apr 20 10:01:13.811244 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.811220 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-szqzf" Apr 20 10:01:13.819583 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.819563 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-szqzf" Apr 20 10:01:13.876026 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.876003 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 10:01:13.880145 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:13.880109 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaadd8592f265af4bb30938041dae753.slice/crio-0b5ec53ae759ab86901351ea3a23d599b1579a6e22c7ac4d292197be162fee86 WatchSource:0}: Error finding container 0b5ec53ae759ab86901351ea3a23d599b1579a6e22c7ac4d292197be162fee86: Status 404 returned error can't find the container with id 0b5ec53ae759ab86901351ea3a23d599b1579a6e22c7ac4d292197be162fee86 Apr 20 10:01:13.884102 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884081 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:01:13.884534 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.884617 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884554 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-etc-kubernetes\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " 
pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.884617 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884586 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-var-lib-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.884617 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884603 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.884778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884611 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysconfig\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.884778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884666 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysconfig\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.884778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884674 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-var-lib-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.884778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b5da0aba-0c60-4b76-b8af-041b04e6fc2b-agent-certs\") pod \"konnectivity-agent-6bwlp\" (UID: \"b5da0aba-0c60-4b76-b8af-041b04e6fc2b\") " pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.884778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884726 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-modprobe-d\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.884778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884753 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysctl-conf\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884785 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-etc-selinux\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884811 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bd60891-3cb2-4033-81b3-819a1fd45edd-hosts-file\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884837 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysctl-d\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884852 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-modprobe-d\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-slash\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bd60891-3cb2-4033-81b3-819a1fd45edd-hosts-file\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysctl-conf\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884980 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-slash\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.884983 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-socket-dir-parent\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885020 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-sysctl-d\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885032 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-iptables-alerter-script\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcjlf\" (UniqueName: \"kubernetes.io/projected/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-kube-api-access-vcjlf\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-kubelet\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885116 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885139 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885165 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-hostroot\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885197 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64lg\" (UniqueName: \"kubernetes.io/projected/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-kube-api-access-c64lg\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885226 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-tuned\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885265 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885293 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k75jj\" (UniqueName: \"kubernetes.io/projected/ed2f027e-e531-41a7-8185-1a14d9f86cb2-kube-api-access-k75jj\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885319 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-host\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-systemd-units\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-cni-multus\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885433 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd566bd9-42a8-48e9-889b-d01eee4488c2-host\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885439 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cd566bd9-42a8-48e9-889b-d01eee4488c2-serviceca\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885487 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-kubelet\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885492 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b5da0aba-0c60-4b76-b8af-041b04e6fc2b-konnectivity-ca\") pod \"konnectivity-agent-6bwlp\" (UID: \"b5da0aba-0c60-4b76-b8af-041b04e6fc2b\") " pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885520 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-ovnkube-script-lib\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.885711 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885563 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-os-release\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885585 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-cni-bin\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885607 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-etc-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 
10:01:13.885633 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-k8s-cni-cncf-io\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885677 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-netns\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-daemon-config\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885705 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-iptables-alerter-script\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885728 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftrzd\" (UniqueName: \"kubernetes.io/projected/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-kube-api-access-ftrzd\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885754 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-socket-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885769 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-host\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885780 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-systemd-units\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885780 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bd60891-3cb2-4033-81b3-819a1fd45edd-tmp-dir\") pod \"node-resolver-tjdkv\" (UID: 
\"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885825 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885861 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd566bd9-42a8-48e9-889b-d01eee4488c2-host\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885885 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgh4\" (UniqueName: \"kubernetes.io/projected/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-kube-api-access-qqgh4\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-systemd\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.886478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.885961 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-var-lib-kubelet\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886014 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-tmp\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886041 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-cni-netd\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886051 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bd60891-3cb2-4033-81b3-819a1fd45edd-tmp-dir\") 
pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886067 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-systemd\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.886083 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886094 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-run-ovn-kubernetes\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886122 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.886146 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:14.386126105 +0000 UTC m=+2.045788784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886212 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-systemd\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886213 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cd566bd9-42a8-48e9-889b-d01eee4488c2-serviceca\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886282 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/580807b1-acaf-4082-b5c8-ab84f495b516-ovn-node-metrics-cert\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886309 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-system-cni-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-cni-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-cnibin\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 
10:01:13.887385 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmnvz\" (UniqueName: \"kubernetes.io/projected/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-kube-api-access-dmnvz\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886719 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-kubelet\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886731 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886750 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-sys-fs\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886780 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cnibin\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886805 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b5da0aba-0c60-4b76-b8af-041b04e6fc2b-konnectivity-ca\") pod \"konnectivity-agent-6bwlp\" (UID: \"b5da0aba-0c60-4b76-b8af-041b04e6fc2b\") " pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886817 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-var-lib-kubelet\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886781 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-cni-netd\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886867 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-kubernetes\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886870 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cnibin\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-kubernetes\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886923 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-node-log\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.886971 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-cni-bin\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887048 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-ovnkube-config\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887077 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-env-overrides\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-conf-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887147 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn962\" (UniqueName: \"kubernetes.io/projected/1bd60891-3cb2-4033-81b3-819a1fd45edd-kube-api-access-xn962\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.888275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887176 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cni-binary-copy\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887249 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-run-netns\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-log-socket\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-host-slash\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887325 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-run\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887326 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887376 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-lib-modules\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887422 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-node-log\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887484 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-ovnkube-script-lib\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 
ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887596 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-lib-modules\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887711 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887790 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-etc-openvswitch\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887849 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-run-ovn-kubernetes\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887875 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed2f027e-e531-41a7-8185-1a14d9f86cb2-cni-binary-copy\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887907 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-systemd\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887931 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-cni-bin\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887940 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-ovn\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889227 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.887996 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-cni-binary-copy\") pod \"multus-smhx6\" (UID: 
\"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888024 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-multus-certs\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888050 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-device-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888104 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lth4\" (UniqueName: \"kubernetes.io/projected/cd566bd9-42a8-48e9-889b-d01eee4488c2-kube-api-access-4lth4\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888134 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-system-cni-dir\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888160 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-os-release\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888186 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-sys\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zf2p\" (UniqueName: \"kubernetes.io/projected/580807b1-acaf-4082-b5c8-ab84f495b516-kube-api-access-7zf2p\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888240 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-registration-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888326 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-run-ovn\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888424 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-ovnkube-config\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888486 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-host-slash\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888550 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-run\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-system-cni-dir\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888663 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed2f027e-e531-41a7-8185-1a14d9f86cb2-os-release\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888705 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-host-run-netns\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888740 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/580807b1-acaf-4082-b5c8-ab84f495b516-log-socket\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.889879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888765 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-etc-tuned\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.890457 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888781 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-sys\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.890457 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.888782 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/580807b1-acaf-4082-b5c8-ab84f495b516-env-overrides\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.890457 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.889578 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b5da0aba-0c60-4b76-b8af-041b04e6fc2b-agent-certs\") pod \"konnectivity-agent-6bwlp\" (UID: \"b5da0aba-0c60-4b76-b8af-041b04e6fc2b\") " pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:13.890457 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.889676 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-tmp\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.891001 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.890983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/580807b1-acaf-4082-b5c8-ab84f495b516-ovn-node-metrics-cert\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.894622 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.894590 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgh4\" (UniqueName: \"kubernetes.io/projected/afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf-kube-api-access-qqgh4\") pod \"iptables-alerter-wqpqq\" (UID: \"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf\") " pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:13.894709 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.894690 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcjlf\" (UniqueName: \"kubernetes.io/projected/eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad-kube-api-access-vcjlf\") pod \"tuned-nhf2t\" (UID: \"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad\") " pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:13.895240 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.895222 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k75jj\" (UniqueName: \"kubernetes.io/projected/ed2f027e-e531-41a7-8185-1a14d9f86cb2-kube-api-access-k75jj\") pod \"multus-additional-cni-plugins-9z29g\" (UID: \"ed2f027e-e531-41a7-8185-1a14d9f86cb2\") " pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:13.898453 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.898252 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:13.898453 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.898289 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:13.898453 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.898324 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:13.899831 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:13.898754 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:14.398705578 +0000 UTC m=+2.058368258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:13.899831 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.899431 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lth4\" (UniqueName: \"kubernetes.io/projected/cd566bd9-42a8-48e9-889b-d01eee4488c2-kube-api-access-4lth4\") pod \"node-ca-jtlqx\" (UID: \"cd566bd9-42a8-48e9-889b-d01eee4488c2\") " pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:13.900742 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.900716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zf2p\" (UniqueName: \"kubernetes.io/projected/580807b1-acaf-4082-b5c8-ab84f495b516-kube-api-access-7zf2p\") pod \"ovnkube-node-69b6q\" (UID: \"580807b1-acaf-4082-b5c8-ab84f495b516\") " pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:13.901165 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.901140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmnvz\" (UniqueName: \"kubernetes.io/projected/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-kube-api-access-dmnvz\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:13.901640 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.901617 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn962\" (UniqueName: \"kubernetes.io/projected/1bd60891-3cb2-4033-81b3-819a1fd45edd-kube-api-access-xn962\") pod \"node-resolver-tjdkv\" (UID: \"1bd60891-3cb2-4033-81b3-819a1fd45edd\") " pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:13.931328 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.931303 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:13.975582 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:13.975541 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod905765c51ae51a6f36edf080ae0fc9ae.slice/crio-fb8244d8c120619b3471bbb7ef8cc3f194d965dfe86dd107c12b75ea1656b64b WatchSource:0}: Error finding container fb8244d8c120619b3471bbb7ef8cc3f194d965dfe86dd107c12b75ea1656b64b: Status 404 returned 
error can't find the container with id fb8244d8c120619b3471bbb7ef8cc3f194d965dfe86dd107c12b75ea1656b64b Apr 20 10:01:13.989077 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989045 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-etc-selinux\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989077 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-socket-dir-parent\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989233 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989098 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-hostroot\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989233 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989186 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-etc-selinux\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989233 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989186 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-socket-dir-parent\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989233 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989215 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c64lg\" (UniqueName: \"kubernetes.io/projected/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-kube-api-access-c64lg\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989230 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-hostroot\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-cni-multus\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989317 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-os-release\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989330 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-cni-multus\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989360 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-cni-bin\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989388 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-k8s-cni-cncf-io\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-os-release\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-netns\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-daemon-config\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989454 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-cni-bin\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989448 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-netns\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989475 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftrzd\" (UniqueName: \"kubernetes.io/projected/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-kube-api-access-ftrzd\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-k8s-cni-cncf-io\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989504 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-socket-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989559 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-system-cni-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989575 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-cni-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989590 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-cnibin\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-socket-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989606 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-kubelet\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989655 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-sys-fs\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989663 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-var-lib-kubelet\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-cni-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989733 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-conf-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.989748 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989733 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-sys-fs\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989748 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-cnibin\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989771 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-cni-binary-copy\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-system-cni-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989800 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-multus-certs\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989826 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-device-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989845 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-conf-dir\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-registration-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989931 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-device-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989948 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-etc-kubernetes\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-host-run-multus-certs\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.989987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-registration-dir\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:13.990428 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.990035 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-etc-kubernetes\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990846 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.990530 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-multus-daemon-config\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.990846 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.990705 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-cni-binary-copy\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.999231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.999208 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftrzd\" (UniqueName: \"kubernetes.io/projected/3e96c2c0-000c-46ed-b65a-85a4c7b0ea18-kube-api-access-ftrzd\") pod \"multus-smhx6\" (UID: \"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18\") " pod="openshift-multus/multus-smhx6" Apr 20 10:01:13.999374 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:13.999296 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64lg\" (UniqueName: \"kubernetes.io/projected/3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7-kube-api-access-c64lg\") pod \"aws-ebs-csi-driver-node-76tbd\" (UID: \"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:14.089452 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.089420 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jtlqx" Apr 20 10:01:14.095701 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.095539 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd566bd9_42a8_48e9_889b_d01eee4488c2.slice/crio-09932648e3ce000527f2198f965b4424823ace8920d40875d46a5c5121f98dd4 WatchSource:0}: Error finding container 09932648e3ce000527f2198f965b4424823ace8920d40875d46a5c5121f98dd4: Status 404 returned error can't find the container with id 09932648e3ce000527f2198f965b4424823ace8920d40875d46a5c5121f98dd4 Apr 20 10:01:14.096630 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.096608 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:14.103041 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.103016 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5da0aba_0c60_4b76_b8af_041b04e6fc2b.slice/crio-2634a7ade4c6e8485a12bd3bc7ca2af9c612fe1bbf00cbbb28f7c9e849fe9648 WatchSource:0}: Error finding container 2634a7ade4c6e8485a12bd3bc7ca2af9c612fe1bbf00cbbb28f7c9e849fe9648: Status 404 returned error can't find the container with id 2634a7ade4c6e8485a12bd3bc7ca2af9c612fe1bbf00cbbb28f7c9e849fe9648 Apr 20 10:01:14.107791 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.107775 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" Apr 20 10:01:14.114168 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.114145 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2dd1bf_0a1d_42f6_8271_5ccafdf2c9ad.slice/crio-7d58eb085bb1ede5a1acd7cbf5364db4e5919478dbea677e83b21b5853dd96f4 WatchSource:0}: Error finding container 7d58eb085bb1ede5a1acd7cbf5364db4e5919478dbea677e83b21b5853dd96f4: Status 404 returned error can't find the container with id 7d58eb085bb1ede5a1acd7cbf5364db4e5919478dbea677e83b21b5853dd96f4 Apr 20 10:01:14.126492 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.126464 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tjdkv" Apr 20 10:01:14.135298 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.135266 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bd60891_3cb2_4033_81b3_819a1fd45edd.slice/crio-dac0044da53f178ca46c45ef57a3543f27642b1404e472fcefd8d482723c4c49 WatchSource:0}: Error finding container dac0044da53f178ca46c45ef57a3543f27642b1404e472fcefd8d482723c4c49: Status 404 returned error can't find the container with id dac0044da53f178ca46c45ef57a3543f27642b1404e472fcefd8d482723c4c49 Apr 20 10:01:14.143450 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.143431 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9z29g" Apr 20 10:01:14.149283 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.149258 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2f027e_e531_41a7_8185_1a14d9f86cb2.slice/crio-c76387a2cec317eadca698692dc54fe66e36d2ff7d05e5a33b19e91a7359f5f2 WatchSource:0}: Error finding container c76387a2cec317eadca698692dc54fe66e36d2ff7d05e5a33b19e91a7359f5f2: Status 404 returned error can't find the container with id c76387a2cec317eadca698692dc54fe66e36d2ff7d05e5a33b19e91a7359f5f2 Apr 20 10:01:14.163135 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.163109 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wqpqq" Apr 20 10:01:14.168934 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.168908 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb0fd43_aa3d_4c27_aea8_0ac8c7dbd6bf.slice/crio-7e80bfec8cf502e230b7b4748c1b922d3f3addb84abbaa0b7709fbf05dda5c53 WatchSource:0}: Error finding container 7e80bfec8cf502e230b7b4748c1b922d3f3addb84abbaa0b7709fbf05dda5c53: Status 404 returned error can't find the container with id 7e80bfec8cf502e230b7b4748c1b922d3f3addb84abbaa0b7709fbf05dda5c53 Apr 20 10:01:14.186321 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.186291 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:14.192037 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.192015 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod580807b1_acaf_4082_b5c8_ab84f495b516.slice/crio-5173c8abf6008997f060787aaa2936b7c81efe377423a3d87b85c539df36dec4 WatchSource:0}: Error finding container 5173c8abf6008997f060787aaa2936b7c81efe377423a3d87b85c539df36dec4: Status 404 returned error can't find the container with id 5173c8abf6008997f060787aaa2936b7c81efe377423a3d87b85c539df36dec4 Apr 20 10:01:14.195047 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.195021 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" Apr 20 10:01:14.201428 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.201404 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b29027d_94f9_4f50_ad5f_5c3c34aa1bf7.slice/crio-09736369b8ce7be6c9e09d51ba18af2fc1e021554d81f2161b06c540a9521d0f WatchSource:0}: Error finding container 09736369b8ce7be6c9e09d51ba18af2fc1e021554d81f2161b06c540a9521d0f: Status 404 returned error can't find the container with id 09736369b8ce7be6c9e09d51ba18af2fc1e021554d81f2161b06c540a9521d0f Apr 20 10:01:14.202206 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.202190 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-smhx6" Apr 20 10:01:14.208126 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:14.208102 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e96c2c0_000c_46ed_b65a_85a4c7b0ea18.slice/crio-26c16cdffef27e3e9b4880f0a2cd6767ec6bb83bf41377c4aeb66294fd6dc803 WatchSource:0}: Error finding container 26c16cdffef27e3e9b4880f0a2cd6767ec6bb83bf41377c4aeb66294fd6dc803: Status 404 returned error can't find the container with id 26c16cdffef27e3e9b4880f0a2cd6767ec6bb83bf41377c4aeb66294fd6dc803 Apr 20 10:01:14.338870 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.338840 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:14.350270 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.350168 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cmdzp"] Apr 20 10:01:14.353438 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.351962 2566 predicate.go:212] "Predicate failed on Pod" pod="kube-system/global-pull-secret-syncer-cmdzp" err="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 20 10:01:14.353438 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.351992 2566 kubelet.go:2420] "Pod admission denied" podUID="4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2" pod="kube-system/global-pull-secret-syncer-cmdzp" reason="NodeAffinity" message="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 20 10:01:14.394358 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.394311 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-dbus\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " 
pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.394524 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.394372 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.394524 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.394437 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:14.394603 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.394549 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:14.394603 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.394553 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-kubelet-config\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.394672 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.394615 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:15.394597413 +0000 UTC m=+3.054260086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:14.495080 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.495038 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.495242 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.495119 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:14.495242 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.495161 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-kubelet-config\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.495242 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.495201 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-dbus\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.495401 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.495304 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:14.495401 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.495362 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-kubelet-config\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.495401 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.495381 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-dbus\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:14.495401 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.495370 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:14.495537 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.495400 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret podName:4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2 nodeName:}" failed. 
No retries permitted until 2026-04-20 10:01:14.995379969 +0000 UTC m=+2.655042652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret") pod "global-pull-secret-syncer-cmdzp" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:14.495537 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.495420 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:14.495537 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.495438 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:14.495537 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:14.495508 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:15.495490773 +0000 UTC m=+3.155153449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:14.794290 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.794254 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:14.821111 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.821071 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 09:56:13 +0000 UTC" deadline="2027-09-23 09:00:08.790684317 +0000 UTC" Apr 20 10:01:14.821111 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.821106 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12502h58m53.969582415s" Apr 20 10:01:14.893326 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.893259 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6bwlp" event={"ID":"b5da0aba-0c60-4b76-b8af-041b04e6fc2b","Type":"ContainerStarted","Data":"2634a7ade4c6e8485a12bd3bc7ca2af9c612fe1bbf00cbbb28f7c9e849fe9648"} Apr 20 10:01:14.900659 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.900629 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-dbus\") pod \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " Apr 20 10:01:14.900832 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.900672 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-kubelet-config\") pod \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " Apr 20 10:01:14.900897 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.900855 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-kubelet-config" (OuterVolumeSpecName: "kubelet-config") pod "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2"). InnerVolumeSpecName "kubelet-config". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 20 10:01:14.900946 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.900896 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-dbus" (OuterVolumeSpecName: "dbus") pod "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2"). InnerVolumeSpecName "dbus". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 20 10:01:14.902552 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.902522 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jtlqx" event={"ID":"cd566bd9-42a8-48e9-889b-d01eee4488c2","Type":"ContainerStarted","Data":"09932648e3ce000527f2198f965b4424823ace8920d40875d46a5c5121f98dd4"} Apr 20 10:01:14.904465 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.904431 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-smhx6" event={"ID":"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18","Type":"ContainerStarted","Data":"26c16cdffef27e3e9b4880f0a2cd6767ec6bb83bf41377c4aeb66294fd6dc803"} Apr 20 10:01:14.913882 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.913818 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerStarted","Data":"fb8244d8c120619b3471bbb7ef8cc3f194d965dfe86dd107c12b75ea1656b64b"} Apr 20 10:01:14.923060 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.923018 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" event={"ID":"daadd8592f265af4bb30938041dae753","Type":"ContainerStarted","Data":"0b5ec53ae759ab86901351ea3a23d599b1579a6e22c7ac4d292197be162fee86"} Apr 20 10:01:14.941427 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.939854 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" event={"ID":"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7","Type":"ContainerStarted","Data":"09736369b8ce7be6c9e09d51ba18af2fc1e021554d81f2161b06c540a9521d0f"} Apr 20 10:01:14.964828 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.964715 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"5173c8abf6008997f060787aaa2936b7c81efe377423a3d87b85c539df36dec4"} Apr 20 10:01:14.991482 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:14.991441 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wqpqq" event={"ID":"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf","Type":"ContainerStarted","Data":"7e80bfec8cf502e230b7b4748c1b922d3f3addb84abbaa0b7709fbf05dda5c53"} Apr 20 10:01:15.001856 ip-10-0-137-106 kubenswrapper[2566]: 
I0420 10:01:15.001707 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:15.001856 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.001800 2566 reconciler_common.go:299] "Volume detached for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-kubelet-config\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:01:15.001856 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.001817 2566 reconciler_common.go:299] "Volume detached for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-dbus\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:01:15.001856 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.001851 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:15.002204 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.001932 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret podName:4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:16.001912188 +0000 UTC m=+3.661574848 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret") pod "global-pull-secret-syncer-cmdzp" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:15.010215 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.010166 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerStarted","Data":"c76387a2cec317eadca698692dc54fe66e36d2ff7d05e5a33b19e91a7359f5f2"} Apr 20 10:01:15.043068 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.042935 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tjdkv" event={"ID":"1bd60891-3cb2-4033-81b3-819a1fd45edd","Type":"ContainerStarted","Data":"dac0044da53f178ca46c45ef57a3543f27642b1404e472fcefd8d482723c4c49"} Apr 20 10:01:15.061689 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.061559 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" event={"ID":"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad","Type":"ContainerStarted","Data":"7d58eb085bb1ede5a1acd7cbf5364db4e5919478dbea677e83b21b5853dd96f4"} Apr 20 10:01:15.405655 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.405542 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:15.405821 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.405728 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 
10:01:15.405821 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.405796 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:17.405775936 +0000 UTC m=+5.065438601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:15.506495 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.506427 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:15.506697 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.506678 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:15.506759 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.506702 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:15.506759 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.506716 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:15.506872 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.506776 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:17.506758013 +0000 UTC m=+5.166420677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:15.821571 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.821482 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 09:56:13 +0000 UTC" deadline="2027-12-11 12:07:34.561754835 +0000 UTC" Apr 20 10:01:15.821571 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.821525 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14402h6m18.740234611s" Apr 20 10:01:15.856126 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.855318 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:15.856126 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.855474 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:15.856126 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.855960 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:15.856126 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:15.856070 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:15.960929 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:15.960890 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:16.014182 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:16.014143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:16.014400 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:16.014341 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:16.014483 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:16.014419 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret podName:4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:18.014401836 +0000 UTC m=+5.674064501 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret") pod "global-pull-secret-syncer-cmdzp" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:17.428305 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:17.428129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:17.428917 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.428325 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:17.428917 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.428409 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:21.428391289 +0000 UTC m=+9.088053951 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:17.529135 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:17.528909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:17.529135 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.529079 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:17.529135 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.529100 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:17.529135 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.529113 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:17.529644 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.529171 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:21.529152655 +0000 UTC m=+9.188815315 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:17.856058 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:17.855500 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:17.856058 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:17.855540 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:17.856058 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.855635 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:17.856058 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:17.855777 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:18.032012 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:18.031973 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:18.032196 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:18.032123 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:18.032196 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:18.032190 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret podName:4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:22.032170703 +0000 UTC m=+9.691833376 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret") pod "global-pull-secret-syncer-cmdzp" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:19.855849 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:19.855260 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:19.855849 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:19.855423 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:19.855849 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:19.855259 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:19.855849 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:19.855756 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:21.461072 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:21.460996 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:21.461568 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.461147 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:21.461568 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.461220 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:29.461199047 +0000 UTC m=+17.120861707 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:21.562082 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:21.562040 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:21.562241 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.562225 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:21.562304 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.562250 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:21.562304 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.562264 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:21.562448 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.562328 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:29.562307758 +0000 UTC m=+17.221970633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:21.855400 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:21.854997 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:21.855400 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:21.855013 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:21.855400 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.855131 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:21.855400 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:21.855218 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:22.066678 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.066644 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") pod \"global-pull-secret-syncer-cmdzp\" (UID: \"4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2\") " pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:22.066862 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:22.066812 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:22.066950 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:22.066876 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret podName:4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:30.066857027 +0000 UTC m=+17.726519699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret") pod "global-pull-secret-syncer-cmdzp" (UID: "4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:22.877100 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.877011 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kube-system/global-pull-secret-syncer-cmdzp"] Apr 20 10:01:22.877618 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.877135 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmdzp" Apr 20 10:01:22.880559 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.880510 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kube-system/global-pull-secret-syncer-cmdzp"] Apr 20 10:01:22.881444 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.881398 2566 status_manager.go:895] "Failed to get status for pod" podUID="4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2" pod="kube-system/global-pull-secret-syncer-cmdzp" err="pods \"global-pull-secret-syncer-cmdzp\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:01:22.903180 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.903091 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-94htn"] Apr 20 10:01:22.909842 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.909805 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:22.909998 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:22.909905 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:22.911758 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.911716 2566 status_manager.go:895] "Failed to get status for pod" podUID="4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2" pod="kube-system/global-pull-secret-syncer-cmdzp" err="pods \"global-pull-secret-syncer-cmdzp\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:01:22.974340 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.974309 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0b10ec74-afc5-4519-a053-44766e5a7624-dbus\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:22.974508 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.974437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0b10ec74-afc5-4519-a053-44766e5a7624-kubelet-config\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:22.974508 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.974467 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:22.974508 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:22.974502 2566 reconciler_common.go:299] "Volume detached for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4598dc8a-cd3a-4ed2-b84d-a4cac6b61fa2-original-pull-secret\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:01:23.075072 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.074997 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0b10ec74-afc5-4519-a053-44766e5a7624-kubelet-config\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:23.075072 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.075048 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:23.075072 
ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.075079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0b10ec74-afc5-4519-a053-44766e5a7624-dbus\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:23.075389 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.075143 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0b10ec74-afc5-4519-a053-44766e5a7624-kubelet-config\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:23.075389 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.075178 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0b10ec74-afc5-4519-a053-44766e5a7624-dbus\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:23.075389 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:23.075226 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:23.075389 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:23.075289 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret podName:0b10ec74-afc5-4519-a053-44766e5a7624 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:23.575270219 +0000 UTC m=+11.234932879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret") pod "global-pull-secret-syncer-94htn" (UID: "0b10ec74-afc5-4519-a053-44766e5a7624") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:23.579813 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.579763 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:23.579990 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:23.579928 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:23.580068 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:23.580008 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret podName:0b10ec74-afc5-4519-a053-44766e5a7624 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:24.579987716 +0000 UTC m=+12.239650384 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret") pod "global-pull-secret-syncer-94htn" (UID: "0b10ec74-afc5-4519-a053-44766e5a7624") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:23.855658 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.855576 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:23.855826 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:23.855583 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:23.855826 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:23.855677 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:23.855826 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:23.855796 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:24.587680 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:24.587638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:24.588082 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:24.587796 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:24.588082 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:24.587870 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret podName:0b10ec74-afc5-4519-a053-44766e5a7624 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:26.587853522 +0000 UTC m=+14.247516186 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret") pod "global-pull-secret-syncer-94htn" (UID: "0b10ec74-afc5-4519-a053-44766e5a7624") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:24.855264 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:24.855178 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:24.855457 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:24.855300 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:25.855534 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:25.855496 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:25.855935 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:25.855499 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:25.855935 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:25.855608 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:25.855935 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:25.855710 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:26.601934 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:26.601900 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:26.602121 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:26.602046 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:26.602185 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:26.602147 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret podName:0b10ec74-afc5-4519-a053-44766e5a7624 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:30.602100111 +0000 UTC m=+18.261762784 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret") pod "global-pull-secret-syncer-94htn" (UID: "0b10ec74-afc5-4519-a053-44766e5a7624") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:26.855084 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:26.854993 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:26.855257 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:26.855130 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:27.855534 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:27.855455 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:27.855951 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:27.855466 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:27.855951 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:27.855578 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:27.855951 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:27.855701 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:28.855217 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:28.855179 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:28.855417 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:28.855317 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:29.523792 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:29.523754 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:29.524310 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.523932 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:29.524310 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.524004 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:45.523987711 +0000 UTC m=+33.183650375 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:29.624362 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:29.624282 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:29.624556 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.624492 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:29.624556 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.624513 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:29.624556 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.624523 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:29.624710 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.624576 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:45.624563128 +0000 UTC m=+33.284225786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:29.855035 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:29.854958 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:29.855035 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:29.854976 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:29.855232 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.855067 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:29.855232 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:29.855216 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:30.632160 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:30.632111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:30.632664 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:30.632271 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:30.632664 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:30.632339 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret podName:0b10ec74-afc5-4519-a053-44766e5a7624 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:38.632317037 +0000 UTC m=+26.291979710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret") pod "global-pull-secret-syncer-94htn" (UID: "0b10ec74-afc5-4519-a053-44766e5a7624") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:30.855752 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:30.855723 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:30.855926 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:30.855848 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:31.855463 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:31.855421 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:31.855918 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:31.855423 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:31.855918 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:31.855566 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:31.855918 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:31.855654 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:32.855934 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:32.855762 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:32.856553 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:32.855996 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:33.099665 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.099626 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tjdkv" event={"ID":"1bd60891-3cb2-4033-81b3-819a1fd45edd","Type":"ContainerStarted","Data":"f34650c212244ae5c5d85d95b303c1303688a22ee013133bb2e5bd8f0ddb2426"} Apr 20 10:01:33.100956 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.100926 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" event={"ID":"eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad","Type":"ContainerStarted","Data":"61ab2da146ccba61f7cbffa3e5522cca015efe8c185a27caa9eed36d0c207e69"} Apr 20 10:01:33.102145 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.102113 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6bwlp" event={"ID":"b5da0aba-0c60-4b76-b8af-041b04e6fc2b","Type":"ContainerStarted","Data":"e7bc9b2a17592dac002772e6c2d659717190847b01c0b2e9ce7a12eaf370d332"} Apr 20 10:01:33.103263 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.103242 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jtlqx" event={"ID":"cd566bd9-42a8-48e9-889b-d01eee4488c2","Type":"ContainerStarted","Data":"6e4548a64f1141fb979561b04910dc1b10db45041cfb705de332b55def84e21b"} Apr 20 10:01:33.104516 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.104493 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-smhx6" event={"ID":"3e96c2c0-000c-46ed-b65a-85a4c7b0ea18","Type":"ContainerStarted","Data":"d2467b74725769d9be620195c747e95f8f86342edc2e33570066e999f0d19d6f"} Apr 20 10:01:33.105757 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.105731 2566 generic.go:358] "Generic (PLEG): container finished" podID="905765c51ae51a6f36edf080ae0fc9ae" containerID="421ce86bd62ef30582b796024c94126d11ac7bbb4a34751deb37c6ace7c1f114" exitCode=0 Apr 20 10:01:33.105854 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.105804 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerDied","Data":"421ce86bd62ef30582b796024c94126d11ac7bbb4a34751deb37c6ace7c1f114"} Apr 20 10:01:33.105920 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.105906 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 20 10:01:33.107122 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.107099 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" event={"ID":"daadd8592f265af4bb30938041dae753","Type":"ContainerStarted","Data":"85802a0df98de5d8506f1448663cd78cb63de8e0bfc41ae6e6fad67396ecb15e"} Apr 20 10:01:33.108275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.108258 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" event={"ID":"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7","Type":"ContainerStarted","Data":"3cf71f2200312b5120f627176b249df144acf559dcb3a9b529899859ed4d0d51"} Apr 20 10:01:33.110586 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110572 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:01:33.110899 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110882 2566 generic.go:358] "Generic (PLEG): container finished" podID="580807b1-acaf-4082-b5c8-ab84f495b516" containerID="37e6047ed295dde87d1ab92ffd0e1177d117f3e3b8aa23622b5e0ee3c7abf94f" exitCode=1 Apr 20 10:01:33.110959 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110945 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"abc2fd6da69d89dbbc4b079f06921fa4d8f892d36c4638f0ec65dafa01a85a76"} Apr 20 10:01:33.110997 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110966 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"b3d58802144a03239028a02ad1f97df3166a886fb33171e899aafcb4a992cb86"} Apr 20 10:01:33.110997 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110976 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"4a2777e4024fd631eeb7d84de6c31c470d54ce3fc78bf43343aa58545c10f887"} Apr 20 10:01:33.110997 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110985 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"6813ae0a2416befca15041323c2e4f6a60c83e40a918c8696c5d791f1988f599"} Apr 20 10:01:33.111091 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.110996 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerDied","Data":"37e6047ed295dde87d1ab92ffd0e1177d117f3e3b8aa23622b5e0ee3c7abf94f"} Apr 20 10:01:33.111091 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.111011 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"035ac5fd827ae6cea0d1b523862bbfdcb982e258a6400e2439d9feb0ed87ff2a"} Apr 20 10:01:33.112148 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.112130 2566 generic.go:358] "Generic (PLEG): container finished" podID="ed2f027e-e531-41a7-8185-1a14d9f86cb2" containerID="cce966e44ffed651f57f85ddb8c78dd661358b57f3a978113ec3ca3f21bc8c22" exitCode=0 Apr 20 10:01:33.112204 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.112160 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerDied","Data":"cce966e44ffed651f57f85ddb8c78dd661358b57f3a978113ec3ca3f21bc8c22"} Apr 20 10:01:33.116262 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.116247 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 10:01:33.117198 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.117167 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal"] Apr 20 
10:01:33.122448 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.122407 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tjdkv" podStartSLOduration=2.29573343 podStartE2EDuration="20.122395553s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.136945438 +0000 UTC m=+1.796608096" lastFinishedPulling="2026-04-20 10:01:31.96360755 +0000 UTC m=+19.623270219" observedRunningTime="2026-04-20 10:01:33.121913322 +0000 UTC m=+20.781576002" watchObservedRunningTime="2026-04-20 10:01:33.122395553 +0000 UTC m=+20.782058233" Apr 20 10:01:33.154460 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.154404 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" podStartSLOduration=20.154383948 podStartE2EDuration="20.154383948s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:33.154340376 +0000 UTC m=+20.814003057" watchObservedRunningTime="2026-04-20 10:01:33.154383948 +0000 UTC m=+20.814046633" Apr 20 10:01:33.170862 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.170795 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6bwlp" podStartSLOduration=3.316613708 podStartE2EDuration="21.170775532s" podCreationTimestamp="2026-04-20 10:01:12 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.104395932 +0000 UTC m=+1.764058591" lastFinishedPulling="2026-04-20 10:01:31.958557756 +0000 UTC m=+19.618220415" observedRunningTime="2026-04-20 10:01:33.170313194 +0000 UTC m=+20.829975876" watchObservedRunningTime="2026-04-20 10:01:33.170775532 +0000 UTC m=+20.830438213" Apr 20 10:01:33.188540 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.188477 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-smhx6" podStartSLOduration=2.401701347 podStartE2EDuration="20.188455483s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.209758833 +0000 UTC m=+1.869421492" lastFinishedPulling="2026-04-20 10:01:31.996512948 +0000 UTC m=+19.656175628" observedRunningTime="2026-04-20 10:01:33.18786058 +0000 UTC m=+20.847523262" watchObservedRunningTime="2026-04-20 10:01:33.188455483 +0000 UTC m=+20.848118166" Apr 20 10:01:33.224636 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.224573 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nhf2t" podStartSLOduration=2.344617874 podStartE2EDuration="20.224556603s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.115830014 +0000 UTC m=+1.775492672" lastFinishedPulling="2026-04-20 10:01:31.995768736 +0000 UTC m=+19.655431401" observedRunningTime="2026-04-20 10:01:33.224224692 +0000 UTC m=+20.883887373" watchObservedRunningTime="2026-04-20 10:01:33.224556603 +0000 UTC m=+20.884219284" Apr 20 10:01:33.224786 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.224714 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jtlqx" podStartSLOduration=3.357823827 podStartE2EDuration="21.224708865s" podCreationTimestamp="2026-04-20 10:01:12 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.097212302 +0000 UTC m=+1.756874961" 
lastFinishedPulling="2026-04-20 10:01:31.964097333 +0000 UTC m=+19.623759999" observedRunningTime="2026-04-20 10:01:33.202192113 +0000 UTC m=+20.861854794" watchObservedRunningTime="2026-04-20 10:01:33.224708865 +0000 UTC m=+20.884371543" Apr 20 10:01:33.757584 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.757540 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 10:01:33.855215 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.855171 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:33.855399 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.855171 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:33.855399 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:33.855306 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:33.855399 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.855107 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T10:01:33.757564515Z","UUID":"d9d86f44-8c03-406d-8b51-7df91530fc5e","Handler":null,"Name":"","Endpoint":""} Apr 20 10:01:33.855399 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:33.855383 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:33.856673 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.856652 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 10:01:33.856673 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:33.856677 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 10:01:34.116078 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:34.116041 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerStarted","Data":"f89a40b28e195dd44bb810a6c6b0b87619870fb48502c2d0fdc66784c67ad1c2"} Apr 20 10:01:34.117915 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:34.117889 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" event={"ID":"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7","Type":"ContainerStarted","Data":"66b48c6b8dcfac748eb58bfe2465cc439bf23d2210c93160183f4169c856c7a9"} Apr 20 10:01:34.119368 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:34.119270 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wqpqq" event={"ID":"afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf","Type":"ContainerStarted","Data":"6251d504f1748b5217f542c2bff5d9794eece9a26135830c0dd3ff3631493882"} Apr 20 10:01:34.132956 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:34.132900 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" podStartSLOduration=1.132885144 podStartE2EDuration="1.132885144s" podCreationTimestamp="2026-04-20 10:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:34.132530938 +0000 UTC m=+21.792193618" watchObservedRunningTime="2026-04-20 10:01:34.132885144 +0000 UTC m=+21.792547826" Apr 20 10:01:34.148643 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:34.148587 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wqpqq" podStartSLOduration=3.35518017 podStartE2EDuration="21.148574036s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.170533763 +0000 UTC m=+1.830196422" lastFinishedPulling="2026-04-20 10:01:31.96392763 +0000 UTC m=+19.623590288" observedRunningTime="2026-04-20 10:01:34.148013762 +0000 UTC m=+21.807676445" watchObservedRunningTime="2026-04-20 10:01:34.148574036 +0000 UTC m=+21.808236718" Apr 20 10:01:34.855702 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:34.855477 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:34.855891 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:34.855760 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:35.123677 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:35.123540 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" event={"ID":"3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7","Type":"ContainerStarted","Data":"0dc9639173a9eff2b3b7fce8e4b6737073d203801dd9758116903cc783eaea92"} Apr 20 10:01:35.126612 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:35.126583 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:01:35.127146 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:35.127107 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"a85df015563dd87f19df0da3e1d61f9dd714a29bc81c4073b430301407175f6b"} Apr 20 10:01:35.143078 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:35.143006 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tbd" podStartSLOduration=1.486188515 podStartE2EDuration="22.142985624s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.203153859 +0000 UTC m=+1.862816518" lastFinishedPulling="2026-04-20 10:01:34.859950953 +0000 UTC m=+22.519613627" observedRunningTime="2026-04-20 10:01:35.14293152 +0000 UTC m=+22.802594201" watchObservedRunningTime="2026-04-20 10:01:35.142985624 +0000 UTC m=+22.802648306" Apr 20 10:01:35.855548 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:35.855504 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:35.855548 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:35.855543 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:35.855806 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:35.855694 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:35.855806 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:35.855727 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:36.855470 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:36.855422 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:36.856105 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:36.855542 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:37.855421 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:37.855226 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:37.855599 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:37.855226 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:37.855599 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:37.855519 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:37.855599 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:37.855559 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:37.979782 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:37.979753 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:37.980369 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:37.980322 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:38.138029 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.137945 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:01:38.138335 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.138310 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"b5819535facee840e1b2592ec8c8be984e5db6aeaf19df7059faec9dcf78efbe"} Apr 20 10:01:38.138637 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.138619 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:38.138838 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.138820 2566 scope.go:117] "RemoveContainer" containerID="37e6047ed295dde87d1ab92ffd0e1177d117f3e3b8aa23622b5e0ee3c7abf94f" Apr 20 10:01:38.140137 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.140095 2566 generic.go:358] "Generic (PLEG): container finished" podID="ed2f027e-e531-41a7-8185-1a14d9f86cb2" containerID="9e443ffc6d83b9e095a0fcafc97d075b419746bb719b21412c7fc8205c3086f7" exitCode=0 Apr 20 10:01:38.140200 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.140169 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerDied","Data":"9e443ffc6d83b9e095a0fcafc97d075b419746bb719b21412c7fc8205c3086f7"} Apr 20 10:01:38.140403 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.140388 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:38.140908 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.140880 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6bwlp" Apr 20 10:01:38.154635 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.154614 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:38.693602 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.693557 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:38.693813 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:38.693690 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:38.693813 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:38.693762 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret podName:0b10ec74-afc5-4519-a053-44766e5a7624 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:54.693746303 +0000 UTC m=+42.353408967 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret") pod "global-pull-secret-syncer-94htn" (UID: "0b10ec74-afc5-4519-a053-44766e5a7624") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:38.855279 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:38.855250 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:38.855449 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:38.855426 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:39.145479 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.145291 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:01:39.145897 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.145824 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" event={"ID":"580807b1-acaf-4082-b5c8-ab84f495b516","Type":"ContainerStarted","Data":"280bcbae8de267c4b7246f334dcb61550062d4ccf8a7f71d318d02a37e5dcbcb"} Apr 20 10:01:39.145969 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.145954 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 10:01:39.146208 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.146186 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:39.147844 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.147821 2566 generic.go:358] "Generic (PLEG): container finished" podID="ed2f027e-e531-41a7-8185-1a14d9f86cb2" containerID="5a0a34ce8a66b2b89b99b982ec851fa554ae5a47ceb2392df7efde121b3e2aa8" exitCode=0 Apr 20 10:01:39.147939 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.147919 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerDied","Data":"5a0a34ce8a66b2b89b99b982ec851fa554ae5a47ceb2392df7efde121b3e2aa8"} Apr 20 10:01:39.161160 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.161138 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:39.184457 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.184412 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" podStartSLOduration=8.314844222 podStartE2EDuration="26.184396433s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.193633138 +0000 UTC m=+1.853295800" lastFinishedPulling="2026-04-20 10:01:32.063185351 +0000 UTC m=+19.722848011" 
observedRunningTime="2026-04-20 10:01:39.182072917 +0000 UTC m=+26.841735598" watchObservedRunningTime="2026-04-20 10:01:39.184396433 +0000 UTC m=+26.844059114" Apr 20 10:01:39.195069 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.195039 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-94htn"] Apr 20 10:01:39.195218 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.195129 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:39.195267 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:39.195213 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:39.203909 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.203841 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dhkq5"] Apr 20 10:01:39.204023 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.203951 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:39.204071 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:39.204031 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:39.214461 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.214435 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kd7l2"] Apr 20 10:01:39.214610 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:39.214562 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:39.214691 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:39.214669 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:40.152163 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.152127 2566 generic.go:358] "Generic (PLEG): container finished" podID="ed2f027e-e531-41a7-8185-1a14d9f86cb2" containerID="3a046af99812f2d21bf96841ceeefef650397a1855b40fbe579d443bfbbfa76b" exitCode=0 Apr 20 10:01:40.152659 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.152222 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerDied","Data":"3a046af99812f2d21bf96841ceeefef650397a1855b40fbe579d443bfbbfa76b"} Apr 20 10:01:40.152659 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.152445 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 10:01:40.542001 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.541954 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:01:40.855570 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.855538 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:40.855722 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.855580 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:40.855722 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:40.855661 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:40.855840 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:40.855753 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:40.855840 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:40.855815 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:40.855939 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:40.855900 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:42.168662 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:42.168610 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" podUID="580807b1-acaf-4082-b5c8-ab84f495b516" containerName="ovnkube-controller" probeResult="failure" output="" Apr 20 10:01:42.856437 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:42.856400 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:42.856615 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:42.856529 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:42.856615 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:42.856572 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:42.856733 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:42.856641 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:42.856733 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:42.856646 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:42.856733 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:42.856713 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:44.855107 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:44.854918 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:44.855569 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:44.854917 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:44.855569 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:44.855193 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd7l2" podUID="59a2e033-9cb8-4b1c-adf3-c0a5307d7e50" Apr 20 10:01:44.855569 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:44.854917 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:44.855569 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:44.855261 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhkq5" podUID="9b07cfc0-68ca-4db2-bd1d-22319ff081b1" Apr 20 10:01:44.855569 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:44.855394 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94htn" podUID="0b10ec74-afc5-4519-a053-44766e5a7624" Apr 20 10:01:45.190913 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.190841 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeReady" Apr 20 10:01:45.191075 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.190986 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 10:01:45.230015 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.229984 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7df4665cc4-bl69s"] Apr 20 10:01:45.260403 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.259766 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7df4665cc4-bl69s"] Apr 20 10:01:45.260403 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.259805 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qxrrr"] Apr 20 10:01:45.277467 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.277431 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qhjjc"] Apr 20 10:01:45.277651 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.277521 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.277651 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.277616 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:45.280528 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.280495 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 10:01:45.280665 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.280621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 10:01:45.280744 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.280689 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4xmrs\"" Apr 20 10:01:45.280833 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.280819 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 10:01:45.280999 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.280983 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 10:01:45.281066 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.281031 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 10:01:45.281215 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.281195 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6bmx\"" Apr 20 10:01:45.281371 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.281321 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 10:01:45.285894 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.285865 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 10:01:45.295877 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.295855 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qxrrr"] Apr 20 10:01:45.295877 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.295881 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qhjjc"] Apr 20 10:01:45.296010 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.295984 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.298597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.298576 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 10:01:45.298731 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.298662 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qvhbt\"" Apr 20 10:01:45.298731 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.298697 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 10:01:45.346733 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346694 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-certificates\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.346897 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346773 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-trusted-ca\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.346897 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346811 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-installation-pull-secrets\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.347014 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346900 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b5b3a62-5e63-40aa-8183-d8243b4b590c-ca-trust-extracted\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.347014 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346927 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.347014 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346955 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-bound-sa-token\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.347014 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.346987 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wzv5q\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-kube-api-access-wzv5q\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.347172 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.347044 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-image-registry-private-configuration\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-certificates\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448040 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.448108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448062 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-trusted-ca\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-config-volume\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.448108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448110 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr7xh\" (UniqueName: \"kubernetes.io/projected/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-kube-api-access-sr7xh\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.448540 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-installation-pull-secrets\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448540 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448292 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:45.448540 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448360 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqfn\" (UniqueName: \"kubernetes.io/projected/55c7ba10-8caa-41b9-bbae-94fed9831f88-kube-api-access-bzqfn\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:45.448540 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448511 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b5b3a62-5e63-40aa-8183-d8243b4b590c-ca-trust-extracted\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448546 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-tmp-dir\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448577 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448596 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-bound-sa-token\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448622 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzv5q\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-kube-api-access-wzv5q\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448655 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-image-registry-private-configuration\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.448692 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:45.448712 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.448708 2566 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df4665cc4-bl69s: secret "image-registry-tls" not found Apr 20 10:01:45.449015 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.448765 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls podName:3b5b3a62-5e63-40aa-8183-d8243b4b590c nodeName:}" failed. No retries permitted until 2026-04-20 10:01:45.94874577 +0000 UTC m=+33.608408449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls") pod "image-registry-7df4665cc4-bl69s" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c") : secret "image-registry-tls" not found Apr 20 10:01:45.449015 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-certificates\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.449015 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.448857 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b5b3a62-5e63-40aa-8183-d8243b4b590c-ca-trust-extracted\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.449206 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.449181 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-trusted-ca\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.453095 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.453070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-installation-pull-secrets\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.453277 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.453070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-image-registry-private-configuration\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.459265 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.459228 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-bound-sa-token\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.459447 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.459419 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wzv5q\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-kube-api-access-wzv5q\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.549358 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-config-volume\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.549358 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr7xh\" (UniqueName: \"kubernetes.io/projected/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-kube-api-access-sr7xh\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.549600 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549400 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:45.549600 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549429 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqfn\" (UniqueName: \"kubernetes.io/projected/55c7ba10-8caa-41b9-bbae-94fed9831f88-kube-api-access-bzqfn\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:45.549600 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:45.549600 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549499 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-tmp-dir\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.549600 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.549577 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:45.549600 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.549593 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:45.549882 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.549645 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert podName:55c7ba10-8caa-41b9-bbae-94fed9831f88 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:46.049624339 +0000 UTC m=+33.709286999 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert") pod "ingress-canary-qxrrr" (UID: "55c7ba10-8caa-41b9-bbae-94fed9831f88") : secret "canary-serving-cert" not found Apr 20 10:01:45.549882 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.549666 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs podName:9b07cfc0-68ca-4db2-bd1d-22319ff081b1 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:17.549656329 +0000 UTC m=+65.209318994 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs") pod "network-metrics-daemon-dhkq5" (UID: "9b07cfc0-68ca-4db2-bd1d-22319ff081b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:45.549882 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549705 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.549882 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.549806 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:45.549882 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549834 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-tmp-dir\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.549882 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.549850 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls podName:d9c07ac7-fc07-4127-8ab6-18d69cec95c8 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:46.049838912 +0000 UTC m=+33.709501571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls") pod "dns-default-qhjjc" (UID: "d9c07ac7-fc07-4127-8ab6-18d69cec95c8") : secret "dns-default-metrics-tls" not found Apr 20 10:01:45.550123 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.549975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-config-volume\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.560363 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.560311 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr7xh\" (UniqueName: \"kubernetes.io/projected/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-kube-api-access-sr7xh\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:45.560532 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.560495 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqfn\" (UniqueName: \"kubernetes.io/projected/55c7ba10-8caa-41b9-bbae-94fed9831f88-kube-api-access-bzqfn\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:45.650506 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.650454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:45.650700 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.650629 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:45.650700 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.650656 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:45.650700 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.650672 2566 projected.go:194] Error preparing data for projected volume kube-api-access-p5jqj for pod openshift-network-diagnostics/network-check-target-kd7l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:45.650872 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.650746 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj podName:59a2e033-9cb8-4b1c-adf3-c0a5307d7e50 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:17.650727826 +0000 UTC m=+65.310390498 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p5jqj" (UniqueName: "kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj") pod "network-check-target-kd7l2" (UID: "59a2e033-9cb8-4b1c-adf3-c0a5307d7e50") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:45.953304 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:45.953265 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:45.953687 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.953451 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:45.953687 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.953476 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df4665cc4-bl69s: secret "image-registry-tls" not found Apr 20 10:01:45.953687 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:45.953554 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls podName:3b5b3a62-5e63-40aa-8183-d8243b4b590c nodeName:}" failed. No retries permitted until 2026-04-20 10:01:46.953533419 +0000 UTC m=+34.613196078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls") pod "image-registry-7df4665cc4-bl69s" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c") : secret "image-registry-tls" not found Apr 20 10:01:46.054695 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.054658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:46.054695 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.054709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:46.054897 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.054812 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:46.054897 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.054874 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls podName:d9c07ac7-fc07-4127-8ab6-18d69cec95c8 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:47.054859522 +0000 UTC m=+34.714522186 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls") pod "dns-default-qhjjc" (UID: "d9c07ac7-fc07-4127-8ab6-18d69cec95c8") : secret "dns-default-metrics-tls" not found Apr 20 10:01:46.054969 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.054812 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:46.054969 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.054940 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert podName:55c7ba10-8caa-41b9-bbae-94fed9831f88 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:47.054929554 +0000 UTC m=+34.714592227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert") pod "ingress-canary-qxrrr" (UID: "55c7ba10-8caa-41b9-bbae-94fed9831f88") : secret "canary-serving-cert" not found Apr 20 10:01:46.559179 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.559145 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9"] Apr 20 10:01:46.593633 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.584943 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg"] Apr 20 10:01:46.593633 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.585114 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.595619 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.595589 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7xvhl\"" Apr 20 10:01:46.595785 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.595627 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 10:01:46.595785 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.595695 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 10:01:46.595785 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.595705 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 10:01:46.597008 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.596987 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 10:01:46.616387 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.616358 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9"] Apr 20 10:01:46.616387 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.616389 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg"] Apr 20 10:01:46.616591 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.616399 2566 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7"] Apr 20 10:01:46.616591 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.616412 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.618384 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.618366 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 10:01:46.635448 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.635425 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7"] Apr 20 10:01:46.635583 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.635565 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.637645 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.637620 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 10:01:46.637737 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.637627 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 10:01:46.637806 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.637766 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 10:01:46.637806 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.637778 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 10:01:46.660295 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.660260 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c5d9f977-2969-42a8-a27f-9ef64b2c3dfc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8dd7b5565-2xzd9\" (UID: \"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.660478 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.660310 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2lp\" (UniqueName: \"kubernetes.io/projected/c5d9f977-2969-42a8-a27f-9ef64b2c3dfc-kube-api-access-lr2lp\") pod \"managed-serviceaccount-addon-agent-8dd7b5565-2xzd9\" (UID: \"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.760918 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.760886 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-ca\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.761108 ip-10-0-137-106 kubenswrapper[2566]: 
I0420 10:01:46.760923 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-tmp\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.761108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761033 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c5d9f977-2969-42a8-a27f-9ef64b2c3dfc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8dd7b5565-2xzd9\" (UID: \"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.761108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.761108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b5dh\" (UniqueName: \"kubernetes.io/projected/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-kube-api-access-5b5dh\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.761269 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761127 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-hub\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.761312 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761292 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2lp\" (UniqueName: \"kubernetes.io/projected/c5d9f977-2969-42a8-a27f-9ef64b2c3dfc-kube-api-access-lr2lp\") pod \"managed-serviceaccount-addon-agent-8dd7b5565-2xzd9\" (UID: \"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.761343 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.761414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761368 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbq82\" (UniqueName: 
\"kubernetes.io/projected/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-kube-api-access-lbq82\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.761414 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761409 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.761490 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.761429 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-klusterlet-config\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.763753 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.763726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c5d9f977-2969-42a8-a27f-9ef64b2c3dfc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8dd7b5565-2xzd9\" (UID: \"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.768823 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.768803 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2lp\" (UniqueName: \"kubernetes.io/projected/c5d9f977-2969-42a8-a27f-9ef64b2c3dfc-kube-api-access-lr2lp\") pod \"managed-serviceaccount-addon-agent-8dd7b5565-2xzd9\" (UID: \"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.855539 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.855503 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:01:46.855681 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.855503 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:01:46.855718 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.855523 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:46.860127 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.860104 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 10:01:46.861918 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.861904 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 10:01:46.862031 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862012 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjrcf\"" Apr 20 10:01:46.862111 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862093 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.862170 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862114 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-plq86\"" Apr 20 10:01:46.862170 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbq82\" (UniqueName: \"kubernetes.io/projected/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-kube-api-access-lbq82\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.862170 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862161 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.862288 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862193 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-klusterlet-config\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.862337 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-ca\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.862337 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862308 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-tmp\") pod 
\"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.862450 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862379 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 10:01:46.862571 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862380 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.862638 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862600 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b5dh\" (UniqueName: \"kubernetes.io/projected/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-kube-api-access-5b5dh\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.862693 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862647 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-hub\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.862791 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862768 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-tmp\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.862990 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.862942 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 10:01:46.863053 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.863019 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.864854 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.864832 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.864969 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.864910 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-ca\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.865022 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.864960 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-klusterlet-config\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.865125 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.865107 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-hub\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.865156 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.865117 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.870580 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.870558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbq82\" (UniqueName: \"kubernetes.io/projected/1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c-kube-api-access-lbq82\") pod \"klusterlet-addon-workmgr-57888c7ff4-gqrbg\" (UID: \"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.871699 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.871677 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b5dh\" (UniqueName: \"kubernetes.io/projected/69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2-kube-api-access-5b5dh\") pod \"cluster-proxy-proxy-agent-65b867f9f7-jx4m7\" (UID: \"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.911908 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.911863 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" Apr 20 10:01:46.924741 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.924705 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:46.943515 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.943484 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:01:46.963456 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:46.963423 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:46.963970 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.963626 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:46.963970 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.963654 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df4665cc4-bl69s: secret "image-registry-tls" not found Apr 20 10:01:46.963970 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:46.963714 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls podName:3b5b3a62-5e63-40aa-8183-d8243b4b590c nodeName:}" failed. No retries permitted until 2026-04-20 10:01:48.963698531 +0000 UTC m=+36.623361195 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls") pod "image-registry-7df4665cc4-bl69s" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c") : secret "image-registry-tls" not found Apr 20 10:01:47.064664 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.064122 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:47.064664 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.064180 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:47.064664 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:47.064315 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:47.064664 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:47.064544 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:47.064664 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:47.064611 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls podName:d9c07ac7-fc07-4127-8ab6-18d69cec95c8 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:49.064590788 +0000 UTC m=+36.724253466 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls") pod "dns-default-qhjjc" (UID: "d9c07ac7-fc07-4127-8ab6-18d69cec95c8") : secret "dns-default-metrics-tls" not found Apr 20 10:01:47.065220 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:47.065091 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert podName:55c7ba10-8caa-41b9-bbae-94fed9831f88 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:49.065062746 +0000 UTC m=+36.724725418 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert") pod "ingress-canary-qxrrr" (UID: "55c7ba10-8caa-41b9-bbae-94fed9831f88") : secret "canary-serving-cert" not found Apr 20 10:01:47.120883 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.120840 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7"] Apr 20 10:01:47.125667 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.125551 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg"] Apr 20 10:01:47.126407 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.126208 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9"] Apr 20 10:01:47.127105 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:47.127084 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69cb98b9_86bf_4b0a_ab3b_fba6d6d789b2.slice/crio-b0052b0308292daed7bda185eba41e55be20d509a127320e321d295c608ea9ce WatchSource:0}: Error finding container b0052b0308292daed7bda185eba41e55be20d509a127320e321d295c608ea9ce: Status 404 returned error can't find the container with id b0052b0308292daed7bda185eba41e55be20d509a127320e321d295c608ea9ce Apr 20 10:01:47.128588 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:47.128561 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d17ccfe_4985_462d_b9cb_f86f7a3d5c3c.slice/crio-d9d78b276ea67259221161aa6cbf6401aaddc0e67574d210d8d545ca879bc15e WatchSource:0}: Error finding container d9d78b276ea67259221161aa6cbf6401aaddc0e67574d210d8d545ca879bc15e: Status 404 returned error can't find the container with id d9d78b276ea67259221161aa6cbf6401aaddc0e67574d210d8d545ca879bc15e Apr 20 10:01:47.129376 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:47.129281 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5d9f977_2969_42a8_a27f_9ef64b2c3dfc.slice/crio-08e572d946b858a5fc306a694a649309153454733e5dfcca15b65109594f98b8 WatchSource:0}: Error finding container 08e572d946b858a5fc306a694a649309153454733e5dfcca15b65109594f98b8: Status 404 returned error can't find the container with id 08e572d946b858a5fc306a694a649309153454733e5dfcca15b65109594f98b8 Apr 20 10:01:47.167114 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.167079 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" 
event={"ID":"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c","Type":"ContainerStarted","Data":"d9d78b276ea67259221161aa6cbf6401aaddc0e67574d210d8d545ca879bc15e"} Apr 20 10:01:47.168027 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.167998 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" event={"ID":"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc","Type":"ContainerStarted","Data":"08e572d946b858a5fc306a694a649309153454733e5dfcca15b65109594f98b8"} Apr 20 10:01:47.170557 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.170520 2566 generic.go:358] "Generic (PLEG): container finished" podID="ed2f027e-e531-41a7-8185-1a14d9f86cb2" containerID="870ee14ad32af8263f384f2a1ec69a3e56e07b865199deb8a14cde2583e3a88d" exitCode=0 Apr 20 10:01:47.170693 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.170611 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerDied","Data":"870ee14ad32af8263f384f2a1ec69a3e56e07b865199deb8a14cde2583e3a88d"} Apr 20 10:01:47.171654 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:47.171630 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" event={"ID":"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2","Type":"ContainerStarted","Data":"b0052b0308292daed7bda185eba41e55be20d509a127320e321d295c608ea9ce"} Apr 20 10:01:48.180289 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:48.180056 2566 generic.go:358] "Generic (PLEG): container finished" podID="ed2f027e-e531-41a7-8185-1a14d9f86cb2" containerID="41b724cb376d47e89f70cf28779a1a85c09e450b2018fabd8a699d09904429d8" exitCode=0 Apr 20 10:01:48.180933 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:48.180335 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerDied","Data":"41b724cb376d47e89f70cf28779a1a85c09e450b2018fabd8a699d09904429d8"} Apr 20 10:01:48.979756 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:48.979658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:48.979933 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:48.979867 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:48.979933 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:48.979885 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df4665cc4-bl69s: secret "image-registry-tls" not found Apr 20 10:01:48.980047 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:48.979945 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls podName:3b5b3a62-5e63-40aa-8183-d8243b4b590c nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.979926602 +0000 UTC m=+40.639589263 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls") pod "image-registry-7df4665cc4-bl69s" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c") : secret "image-registry-tls" not found Apr 20 10:01:49.080844 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:49.080804 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:49.081037 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:49.080871 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:49.081103 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:49.081056 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:49.081159 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:49.081115 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert podName:55c7ba10-8caa-41b9-bbae-94fed9831f88 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.08109716 +0000 UTC m=+40.740759823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert") pod "ingress-canary-qxrrr" (UID: "55c7ba10-8caa-41b9-bbae-94fed9831f88") : secret "canary-serving-cert" not found Apr 20 10:01:49.081690 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:49.081549 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:49.081690 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:49.081606 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls podName:d9c07ac7-fc07-4127-8ab6-18d69cec95c8 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.081589071 +0000 UTC m=+40.741251733 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls") pod "dns-default-qhjjc" (UID: "d9c07ac7-fc07-4127-8ab6-18d69cec95c8") : secret "dns-default-metrics-tls" not found Apr 20 10:01:49.188705 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:49.188638 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9z29g" event={"ID":"ed2f027e-e531-41a7-8185-1a14d9f86cb2","Type":"ContainerStarted","Data":"c6730caa7e2114120dadcd1734d1f093ad0387548f55270e582eb8a10681fd74"} Apr 20 10:01:49.216577 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:49.216257 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9z29g" podStartSLOduration=4.181125991 podStartE2EDuration="36.21623583s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:01:14.150930951 +0000 UTC m=+1.810593610" lastFinishedPulling="2026-04-20 10:01:46.186040777 +0000 UTC m=+33.845703449" observedRunningTime="2026-04-20 10:01:49.214682488 +0000 UTC m=+36.874345185" watchObservedRunningTime="2026-04-20 10:01:49.21623583 +0000 UTC m=+36.875898505" Apr 20 10:01:53.016563 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.016531 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:01:53.016959 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.016674 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:53.016959 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.016693 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df4665cc4-bl69s: secret "image-registry-tls" not found Apr 20 10:01:53.016959 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.016744 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls podName:3b5b3a62-5e63-40aa-8183-d8243b4b590c nodeName:}" failed. No retries permitted until 2026-04-20 10:02:01.016729192 +0000 UTC m=+48.676391857 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls") pod "image-registry-7df4665cc4-bl69s" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c") : secret "image-registry-tls" not found Apr 20 10:01:53.117315 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.117217 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:01:53.117315 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.117264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:01:53.117535 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.117392 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:53.117535 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.117448 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert podName:55c7ba10-8caa-41b9-bbae-94fed9831f88 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:01.117433554 +0000 UTC m=+48.777096214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert") pod "ingress-canary-qxrrr" (UID: "55c7ba10-8caa-41b9-bbae-94fed9831f88") : secret "canary-serving-cert" not found Apr 20 10:01:53.117535 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.117392 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:53.117535 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:01:53.117513 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls podName:d9c07ac7-fc07-4127-8ab6-18d69cec95c8 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:01.117501357 +0000 UTC m=+48.777164029 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls") pod "dns-default-qhjjc" (UID: "d9c07ac7-fc07-4127-8ab6-18d69cec95c8") : secret "dns-default-metrics-tls" not found Apr 20 10:01:53.198066 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.198028 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" event={"ID":"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2","Type":"ContainerStarted","Data":"21904be6c5d139315462d69ddfdea8f652171e21c888d13db54a47ebff1b356a"} Apr 20 10:01:53.199295 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.199259 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" event={"ID":"1d17ccfe-4985-462d-b9cb-f86f7a3d5c3c","Type":"ContainerStarted","Data":"c6a71e15b4a42ab8421a2c691763617ede80d87c36ee2744fa2f642613aa6fb2"} Apr 20 10:01:53.199486 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.199458 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:53.200738 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.200704 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" event={"ID":"c5d9f977-2969-42a8-a27f-9ef64b2c3dfc","Type":"ContainerStarted","Data":"d972e8da28cc6c7478bbbe178e82d4f617bfa01b3a2e2015d93bef7552010353"} Apr 20 10:01:53.201281 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.201261 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" Apr 20 10:01:53.216537 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.216483 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57888c7ff4-gqrbg" podStartSLOduration=1.473822394 podStartE2EDuration="7.216466495s" podCreationTimestamp="2026-04-20 10:01:46 +0000 UTC" firstStartedPulling="2026-04-20 10:01:47.130340409 +0000 UTC m=+34.790003082" lastFinishedPulling="2026-04-20 10:01:52.872984509 +0000 UTC m=+40.532647183" observedRunningTime="2026-04-20 10:01:53.216110883 +0000 UTC m=+40.875773564" watchObservedRunningTime="2026-04-20 10:01:53.216466495 +0000 UTC m=+40.876129177" Apr 20 10:01:53.235161 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:53.235112 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8dd7b5565-2xzd9" podStartSLOduration=1.507786663 podStartE2EDuration="7.23509776s" podCreationTimestamp="2026-04-20 10:01:46 +0000 UTC" firstStartedPulling="2026-04-20 10:01:47.131266064 +0000 UTC m=+34.790928723" lastFinishedPulling="2026-04-20 10:01:52.858577161 +0000 UTC m=+40.518239820" observedRunningTime="2026-04-20 10:01:53.235006106 +0000 UTC m=+40.894668799" watchObservedRunningTime="2026-04-20 10:01:53.23509776 +0000 UTC m=+40.894760441" Apr 20 10:01:54.734100 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:54.734058 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" 
(UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:54.737459 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:54.737428 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0b10ec74-afc5-4519-a053-44766e5a7624-original-pull-secret\") pod \"global-pull-secret-syncer-94htn\" (UID: \"0b10ec74-afc5-4519-a053-44766e5a7624\") " pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:54.990319 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:54.990282 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94htn" Apr 20 10:01:55.358635 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:55.358597 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-94htn"] Apr 20 10:01:55.362717 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:01:55.362687 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b10ec74_afc5_4519_a053_44766e5a7624.slice/crio-1f5a6f399a64dce2631897d8a13da9474647864d2c0cb986b96666e5588529bf WatchSource:0}: Error finding container 1f5a6f399a64dce2631897d8a13da9474647864d2c0cb986b96666e5588529bf: Status 404 returned error can't find the container with id 1f5a6f399a64dce2631897d8a13da9474647864d2c0cb986b96666e5588529bf Apr 20 10:01:56.208888 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:56.208850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" event={"ID":"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2","Type":"ContainerStarted","Data":"ddbe4a906602483eb610f060aac25b44a26a41cabbae3252fd9cb3dde001d7d3"} Apr 20 10:01:56.208888 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:56.208893 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" event={"ID":"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2","Type":"ContainerStarted","Data":"04182c8af5d6f9187c3eb3107b5feed9fc3b3b8a49de097622025db89e11ebec"} Apr 20 10:01:56.210223 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:56.210192 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-94htn" event={"ID":"0b10ec74-afc5-4519-a053-44766e5a7624","Type":"ContainerStarted","Data":"1f5a6f399a64dce2631897d8a13da9474647864d2c0cb986b96666e5588529bf"} Apr 20 10:01:56.236143 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:01:56.236084 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" podStartSLOduration=2.122674868 podStartE2EDuration="10.236064955s" podCreationTimestamp="2026-04-20 10:01:46 +0000 UTC" firstStartedPulling="2026-04-20 10:01:47.129596135 +0000 UTC m=+34.789258794" lastFinishedPulling="2026-04-20 10:01:55.242986217 +0000 UTC m=+42.902648881" observedRunningTime="2026-04-20 10:01:56.2347291 +0000 UTC m=+43.894391781" watchObservedRunningTime="2026-04-20 10:01:56.236064955 +0000 UTC m=+43.895727637" Apr 20 10:02:00.221112 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:00.221074 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-94htn" 
event={"ID":"0b10ec74-afc5-4519-a053-44766e5a7624","Type":"ContainerStarted","Data":"6b53e3a9e7b7968291b92f0e1a5f3e7b34396893ba93d96c3da4c7c5cc601d7d"} Apr 20 10:02:00.239426 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:00.239372 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-94htn" podStartSLOduration=33.587498421 podStartE2EDuration="38.239338639s" podCreationTimestamp="2026-04-20 10:01:22 +0000 UTC" firstStartedPulling="2026-04-20 10:01:55.364740132 +0000 UTC m=+43.024402791" lastFinishedPulling="2026-04-20 10:02:00.01658035 +0000 UTC m=+47.676243009" observedRunningTime="2026-04-20 10:02:00.238721181 +0000 UTC m=+47.898383863" watchObservedRunningTime="2026-04-20 10:02:00.239338639 +0000 UTC m=+47.899001320" Apr 20 10:02:01.088025 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:01.087986 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:02:01.088201 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.088147 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:02:01.088201 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.088169 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df4665cc4-bl69s: secret "image-registry-tls" not found Apr 20 10:02:01.088391 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.088229 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls podName:3b5b3a62-5e63-40aa-8183-d8243b4b590c nodeName:}" failed. No retries permitted until 2026-04-20 10:02:17.088212904 +0000 UTC m=+64.747875562 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls") pod "image-registry-7df4665cc4-bl69s" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c") : secret "image-registry-tls" not found Apr 20 10:02:01.188778 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:01.188738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:02:01.188959 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:01.188838 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:02:01.188959 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.188887 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:02:01.188959 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.188922 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:02:01.188959 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.188951 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert podName:55c7ba10-8caa-41b9-bbae-94fed9831f88 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:17.188935766 +0000 UTC m=+64.848598430 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert") pod "ingress-canary-qxrrr" (UID: "55c7ba10-8caa-41b9-bbae-94fed9831f88") : secret "canary-serving-cert" not found Apr 20 10:02:01.189115 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:01.188967 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls podName:d9c07ac7-fc07-4127-8ab6-18d69cec95c8 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:17.188961272 +0000 UTC m=+64.848623930 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls") pod "dns-default-qhjjc" (UID: "d9c07ac7-fc07-4127-8ab6-18d69cec95c8") : secret "dns-default-metrics-tls" not found Apr 20 10:02:01.656861 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:01.656830 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tjdkv_1bd60891-3cb2-4033-81b3-819a1fd45edd/dns-node-resolver/0.log" Apr 20 10:02:02.455921 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:02.455895 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jtlqx_cd566bd9-42a8-48e9-889b-d01eee4488c2/node-ca/0.log" Apr 20 10:02:12.167077 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:12.167043 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-69b6q" Apr 20 10:02:17.123015 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.122970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:02:17.125544 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.125516 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"image-registry-7df4665cc4-bl69s\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:02:17.224113 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.224074 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:02:17.224247 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.224134 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:02:17.226568 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.226538 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c07ac7-fc07-4127-8ab6-18d69cec95c8-metrics-tls\") pod \"dns-default-qhjjc\" (UID: \"d9c07ac7-fc07-4127-8ab6-18d69cec95c8\") " pod="openshift-dns/dns-default-qhjjc" Apr 20 10:02:17.226684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.226647 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55c7ba10-8caa-41b9-bbae-94fed9831f88-cert\") pod \"ingress-canary-qxrrr\" (UID: \"55c7ba10-8caa-41b9-bbae-94fed9831f88\") " pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:02:17.393575 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.393495 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4xmrs\"" Apr 20 10:02:17.400116 
ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.400088 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6bmx\"" Apr 20 10:02:17.401314 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.401297 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qxrrr" Apr 20 10:02:17.408424 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.408399 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:02:17.408665 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.408644 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qvhbt\"" Apr 20 10:02:17.417119 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.417096 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qhjjc" Apr 20 10:02:17.538700 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.538670 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qxrrr"] Apr 20 10:02:17.542642 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:17.542612 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c7ba10_8caa_41b9_bbae_94fed9831f88.slice/crio-46f57644e2c8595c846f343c96e8565d88a1e1301e37ee83c9f46cd9594e8d29 WatchSource:0}: Error finding container 46f57644e2c8595c846f343c96e8565d88a1e1301e37ee83c9f46cd9594e8d29: Status 404 returned error can't find the container with id 46f57644e2c8595c846f343c96e8565d88a1e1301e37ee83c9f46cd9594e8d29 Apr 20 10:02:17.627289 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.627254 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:02:17.629707 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.629688 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 10:02:17.639905 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.639879 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b07cfc0-68ca-4db2-bd1d-22319ff081b1-metrics-certs\") pod \"network-metrics-daemon-dhkq5\" (UID: \"9b07cfc0-68ca-4db2-bd1d-22319ff081b1\") " pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:02:17.728432 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.728324 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:02:17.730802 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.730780 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 10:02:17.741571 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.741540 2566 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 10:02:17.752696 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.752669 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jqj\" (UniqueName: \"kubernetes.io/projected/59a2e033-9cb8-4b1c-adf3-c0a5307d7e50-kube-api-access-p5jqj\") pod \"network-check-target-kd7l2\" (UID: \"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50\") " pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:02:17.762423 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.762393 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qhjjc"] Apr 20 10:02:17.765497 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:17.765466 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c07ac7_fc07_4127_8ab6_18d69cec95c8.slice/crio-df239deeb3ee72e4e87ffb7e13995514fb98a1b62d5447981a3c10cd8df48716 WatchSource:0}: Error finding container df239deeb3ee72e4e87ffb7e13995514fb98a1b62d5447981a3c10cd8df48716: Status 404 returned error can't find the container with id df239deeb3ee72e4e87ffb7e13995514fb98a1b62d5447981a3c10cd8df48716 Apr 20 10:02:17.766108 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.766088 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7df4665cc4-bl69s"] Apr 20 10:02:17.768596 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:17.768570 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b5b3a62_5e63_40aa_8183_d8243b4b590c.slice/crio-46198211d0eae2a929d8659a88f25b118ba2c57c44255680e52f95edf011ada3 WatchSource:0}: Error finding container 46198211d0eae2a929d8659a88f25b118ba2c57c44255680e52f95edf011ada3: Status 404 returned error can't find the container with id 46198211d0eae2a929d8659a88f25b118ba2c57c44255680e52f95edf011ada3 Apr 20 10:02:17.782559 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.782533 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjrcf\"" Apr 20 10:02:17.788759 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.788737 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-plq86\"" Apr 20 10:02:17.790834 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.790817 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhkq5" Apr 20 10:02:17.797299 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.797271 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:02:17.914796 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.914744 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dhkq5"] Apr 20 10:02:17.917534 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:17.917507 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b07cfc0_68ca_4db2_bd1d_22319ff081b1.slice/crio-d62bb9763eb6a5d3536dd7732479b673a50f7510be5210bfb5b3f1c88f1598d6 WatchSource:0}: Error finding container d62bb9763eb6a5d3536dd7732479b673a50f7510be5210bfb5b3f1c88f1598d6: Status 404 returned error can't find the container with id d62bb9763eb6a5d3536dd7732479b673a50f7510be5210bfb5b3f1c88f1598d6 Apr 20 10:02:17.936389 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:17.936339 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kd7l2"] Apr 20 10:02:17.953667 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:17.953632 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a2e033_9cb8_4b1c_adf3_c0a5307d7e50.slice/crio-0b88cdeaa00b8474a5bdc4696c89c37d7807564a051c8efff8cf8fd25d366a4f WatchSource:0}: Error finding container 0b88cdeaa00b8474a5bdc4696c89c37d7807564a051c8efff8cf8fd25d366a4f: Status 404 returned error can't find the container with id 0b88cdeaa00b8474a5bdc4696c89c37d7807564a051c8efff8cf8fd25d366a4f Apr 20 10:02:18.265742 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.265705 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qhjjc" event={"ID":"d9c07ac7-fc07-4127-8ab6-18d69cec95c8","Type":"ContainerStarted","Data":"df239deeb3ee72e4e87ffb7e13995514fb98a1b62d5447981a3c10cd8df48716"} Apr 20 10:02:18.267513 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.267479 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kd7l2" event={"ID":"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50","Type":"ContainerStarted","Data":"0b88cdeaa00b8474a5bdc4696c89c37d7807564a051c8efff8cf8fd25d366a4f"} Apr 20 10:02:18.268751 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.268723 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhkq5" event={"ID":"9b07cfc0-68ca-4db2-bd1d-22319ff081b1","Type":"ContainerStarted","Data":"d62bb9763eb6a5d3536dd7732479b673a50f7510be5210bfb5b3f1c88f1598d6"} Apr 20 10:02:18.270302 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.270275 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" event={"ID":"3b5b3a62-5e63-40aa-8183-d8243b4b590c","Type":"ContainerStarted","Data":"8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45"} Apr 20 10:02:18.270453 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.270304 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" event={"ID":"3b5b3a62-5e63-40aa-8183-d8243b4b590c","Type":"ContainerStarted","Data":"46198211d0eae2a929d8659a88f25b118ba2c57c44255680e52f95edf011ada3"} Apr 20 10:02:18.270525 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.270463 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 
10:02:18.271513 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:18.271487 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qxrrr" event={"ID":"55c7ba10-8caa-41b9-bbae-94fed9831f88","Type":"ContainerStarted","Data":"46f57644e2c8595c846f343c96e8565d88a1e1301e37ee83c9f46cd9594e8d29"} Apr 20 10:02:20.295266 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.295206 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" podStartSLOduration=67.295185834 podStartE2EDuration="1m7.295185834s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:02:18.293803899 +0000 UTC m=+65.953466582" watchObservedRunningTime="2026-04-20 10:02:20.295185834 +0000 UTC m=+67.954848515" Apr 20 10:02:20.295698 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.295543 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-b5fs4"] Apr 20 10:02:20.300334 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.300309 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.302761 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.302735 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 10:02:20.338420 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.338231 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b5fs4"] Apr 20 10:02:20.338589 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.338476 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 10:02:20.338769 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.338751 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 10:02:20.338988 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.338972 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 10:02:20.339237 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.339222 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wtn5t\"" Apr 20 10:02:20.350635 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.350591 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.350823 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.350653 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" 
Apr 20 10:02:20.350823 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.350765 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-crio-socket\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.350823 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.350794 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9v64\" (UniqueName: \"kubernetes.io/projected/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-kube-api-access-n9v64\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.350961 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.350855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-data-volume\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451402 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.451342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-data-volume\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451613 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.451425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451613 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.451454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451613 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.451502 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-crio-socket\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451613 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.451525 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9v64\" (UniqueName: \"kubernetes.io/projected/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-kube-api-access-n9v64\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451805 ip-10-0-137-106 
kubenswrapper[2566]: I0420 10:02:20.451681 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-crio-socket\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.451805 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.451744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-data-volume\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.452210 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.452186 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.454848 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.454816 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.461910 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.461887 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9v64\" (UniqueName: \"kubernetes.io/projected/487f661d-51d8-4fe5-bd4d-40b9f9a67f05-kube-api-access-n9v64\") pod \"insights-runtime-extractor-b5fs4\" (UID: \"487f661d-51d8-4fe5-bd4d-40b9f9a67f05\") " pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:20.611029 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:20.610947 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b5fs4" Apr 20 10:02:21.207584 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.206920 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b5fs4"] Apr 20 10:02:21.216087 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:21.216054 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487f661d_51d8_4fe5_bd4d_40b9f9a67f05.slice/crio-4fca52bad81458e23fda7e877416f7a86e753a552279046ebcecbb9c570e47a7 WatchSource:0}: Error finding container 4fca52bad81458e23fda7e877416f7a86e753a552279046ebcecbb9c570e47a7: Status 404 returned error can't find the container with id 4fca52bad81458e23fda7e877416f7a86e753a552279046ebcecbb9c570e47a7 Apr 20 10:02:21.296952 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.296896 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhkq5" event={"ID":"9b07cfc0-68ca-4db2-bd1d-22319ff081b1","Type":"ContainerStarted","Data":"4ecd081bf5b5259b95d787db802a0337ac60a90d855e7720425b8bad96a8f4aa"} Apr 20 10:02:21.304885 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.304845 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qxrrr" event={"ID":"55c7ba10-8caa-41b9-bbae-94fed9831f88","Type":"ContainerStarted","Data":"140fd6fc52c0a461a35d26d58739d1a8b9646e9aca18d63d6ae3fef3b00fce00"} Apr 20 10:02:21.317430 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.315656 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qhjjc" event={"ID":"d9c07ac7-fc07-4127-8ab6-18d69cec95c8","Type":"ContainerStarted","Data":"db0b8c33a2830b7ee405908ac20248fc292b499d78d4c9ec217bee58f4f64eb8"} Apr 20 10:02:21.326235 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.326197 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kd7l2" event={"ID":"59a2e033-9cb8-4b1c-adf3-c0a5307d7e50","Type":"ContainerStarted","Data":"260e9b0d87044572dcef5fa5470bcc390889224a4ed6efb5e4fd5252d18926cb"} Apr 20 10:02:21.326823 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.326788 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:02:21.327615 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.327211 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qxrrr" podStartSLOduration=32.810628504 podStartE2EDuration="36.327194896s" podCreationTimestamp="2026-04-20 10:01:45 +0000 UTC" firstStartedPulling="2026-04-20 10:02:17.54431633 +0000 UTC m=+65.203978989" lastFinishedPulling="2026-04-20 10:02:21.060882717 +0000 UTC m=+68.720545381" observedRunningTime="2026-04-20 10:02:21.326025104 +0000 UTC m=+68.985687786" watchObservedRunningTime="2026-04-20 10:02:21.327194896 +0000 UTC m=+68.986857579" Apr 20 10:02:21.337212 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.336797 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5fs4" event={"ID":"487f661d-51d8-4fe5-bd4d-40b9f9a67f05","Type":"ContainerStarted","Data":"64a2dacb97401f45350f60c0dc3133f94a7aa5562a578e3901fc2665fb4e0b88"} Apr 20 10:02:21.337212 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:21.336844 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-b5fs4" event={"ID":"487f661d-51d8-4fe5-bd4d-40b9f9a67f05","Type":"ContainerStarted","Data":"4fca52bad81458e23fda7e877416f7a86e753a552279046ebcecbb9c570e47a7"} Apr 20 10:02:22.341360 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.341308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5fs4" event={"ID":"487f661d-51d8-4fe5-bd4d-40b9f9a67f05","Type":"ContainerStarted","Data":"417b0bdf7e2698ecbb8432d0252e4ceef004246f86e0e1ac03a1e30e89f24118"} Apr 20 10:02:22.342879 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.342850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhkq5" event={"ID":"9b07cfc0-68ca-4db2-bd1d-22319ff081b1","Type":"ContainerStarted","Data":"e06e2ad4726b24b35ddfbeb3b9dfd2ed3d20697fb499b0bfb234405269680c29"} Apr 20 10:02:22.344391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.344360 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qhjjc" event={"ID":"d9c07ac7-fc07-4127-8ab6-18d69cec95c8","Type":"ContainerStarted","Data":"cfff49034845d76f18c0633c7dea01e8fd1763e1cf45527d751cf1b79d38e034"} Apr 20 10:02:22.344589 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.344577 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qhjjc" Apr 20 10:02:22.359231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.359180 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dhkq5" podStartSLOduration=66.214702027 podStartE2EDuration="1m9.3591648s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:02:17.91949328 +0000 UTC m=+65.579155939" lastFinishedPulling="2026-04-20 10:02:21.063956044 +0000 UTC m=+68.723618712" observedRunningTime="2026-04-20 10:02:22.358222657 +0000 UTC m=+70.017885339" watchObservedRunningTime="2026-04-20 10:02:22.3591648 +0000 UTC m=+70.018827480" Apr 20 10:02:22.359415 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.359260 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kd7l2" podStartSLOduration=66.192223168 podStartE2EDuration="1m9.359255694s" podCreationTimestamp="2026-04-20 10:01:13 +0000 UTC" firstStartedPulling="2026-04-20 10:02:17.956164024 +0000 UTC m=+65.615826682" lastFinishedPulling="2026-04-20 10:02:21.123196535 +0000 UTC m=+68.782859208" observedRunningTime="2026-04-20 10:02:21.348192265 +0000 UTC m=+69.007854958" watchObservedRunningTime="2026-04-20 10:02:22.359255694 +0000 UTC m=+70.018918376" Apr 20 10:02:22.375011 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:22.374954 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qhjjc" podStartSLOduration=34.080091234 podStartE2EDuration="37.37493564s" podCreationTimestamp="2026-04-20 10:01:45 +0000 UTC" firstStartedPulling="2026-04-20 10:02:17.767792514 +0000 UTC m=+65.427455172" lastFinishedPulling="2026-04-20 10:02:21.062636917 +0000 UTC m=+68.722299578" observedRunningTime="2026-04-20 10:02:22.373910915 +0000 UTC m=+70.033573588" watchObservedRunningTime="2026-04-20 10:02:22.37493564 +0000 UTC m=+70.034598319" Apr 20 10:02:24.351685 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:24.351641 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5fs4" 
event={"ID":"487f661d-51d8-4fe5-bd4d-40b9f9a67f05","Type":"ContainerStarted","Data":"5fa2274fdf32241e498a895a63416746affabba2d30c6576cec9244d55b5bf35"} Apr 20 10:02:24.370470 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:24.370421 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b5fs4" podStartSLOduration=2.350364247 podStartE2EDuration="4.370404341s" podCreationTimestamp="2026-04-20 10:02:20 +0000 UTC" firstStartedPulling="2026-04-20 10:02:21.27763184 +0000 UTC m=+68.937294511" lastFinishedPulling="2026-04-20 10:02:23.297671942 +0000 UTC m=+70.957334605" observedRunningTime="2026-04-20 10:02:24.369824848 +0000 UTC m=+72.029487542" watchObservedRunningTime="2026-04-20 10:02:24.370404341 +0000 UTC m=+72.030067022" Apr 20 10:02:32.348910 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:32.348799 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qhjjc" Apr 20 10:02:35.054461 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.054425 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gq8j4"] Apr 20 10:02:35.059242 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.059217 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.061684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.061655 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 10:02:35.061684 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.061674 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 10:02:35.062104 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.062085 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 10:02:35.062241 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.062193 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 10:02:35.062364 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.062322 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 10:02:35.062364 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.062327 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 10:02:35.062523 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.062421 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gq8bt\"" Apr 20 10:02:35.157990 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.157954 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-root\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.157990 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.157994 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-textfile\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158230 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158017 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-accelerators-collector-config\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158230 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158043 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-wtmp\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158230 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158148 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158230 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158190 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2xjb\" (UniqueName: \"kubernetes.io/projected/c59a8f97-78e5-4ba1-9567-dad2612f9fee-kube-api-access-z2xjb\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158424 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158252 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-sys\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158424 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158285 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-tls\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.158424 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.158369 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c59a8f97-78e5-4ba1-9567-dad2612f9fee-metrics-client-ca\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.258979 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.258941 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c59a8f97-78e5-4ba1-9567-dad2612f9fee-metrics-client-ca\") pod 
\"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.258995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-root\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259018 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-textfile\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-accelerators-collector-config\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259073 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-wtmp\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259110 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259126 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-root\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259137 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2xjb\" (UniqueName: \"kubernetes.io/projected/c59a8f97-78e5-4ba1-9567-dad2612f9fee-kube-api-access-z2xjb\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259501 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259201 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-sys\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259501 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259236 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-tls\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259501 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259287 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-wtmp\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259501 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:35.259397 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 10:02:35.259501 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:02:35.259462 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-tls podName:c59a8f97-78e5-4ba1-9567-dad2612f9fee nodeName:}" failed. No retries permitted until 2026-04-20 10:02:35.759442065 +0000 UTC m=+83.419104729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-tls") pod "node-exporter-gq8j4" (UID: "c59a8f97-78e5-4ba1-9567-dad2612f9fee") : secret "node-exporter-tls" not found Apr 20 10:02:35.259703 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259511 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c59a8f97-78e5-4ba1-9567-dad2612f9fee-sys\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259703 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259612 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c59a8f97-78e5-4ba1-9567-dad2612f9fee-metrics-client-ca\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.259766 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.259708 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-accelerators-collector-config\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.260023 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.260000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-textfile\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.262111 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.262087 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " 
pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.268542 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.268514 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2xjb\" (UniqueName: \"kubernetes.io/projected/c59a8f97-78e5-4ba1-9567-dad2612f9fee-kube-api-access-z2xjb\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.764701 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.764666 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-tls\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.767011 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.766975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c59a8f97-78e5-4ba1-9567-dad2612f9fee-node-exporter-tls\") pod \"node-exporter-gq8j4\" (UID: \"c59a8f97-78e5-4ba1-9567-dad2612f9fee\") " pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.973151 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:35.973117 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gq8j4" Apr 20 10:02:35.984243 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:02:35.984205 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59a8f97_78e5_4ba1_9567_dad2612f9fee.slice/crio-e52369a021d9b87e1209f5312fb5b3ec7509f6ce54251aaff52907cbf0f92945 WatchSource:0}: Error finding container e52369a021d9b87e1209f5312fb5b3ec7509f6ce54251aaff52907cbf0f92945: Status 404 returned error can't find the container with id e52369a021d9b87e1209f5312fb5b3ec7509f6ce54251aaff52907cbf0f92945 Apr 20 10:02:36.383807 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:36.383769 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gq8j4" event={"ID":"c59a8f97-78e5-4ba1-9567-dad2612f9fee","Type":"ContainerStarted","Data":"e52369a021d9b87e1209f5312fb5b3ec7509f6ce54251aaff52907cbf0f92945"} Apr 20 10:02:37.388358 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:37.388317 2566 generic.go:358] "Generic (PLEG): container finished" podID="c59a8f97-78e5-4ba1-9567-dad2612f9fee" containerID="d1ed342ddefc0a093ea440378226a1a643a4ab9e2bc8532b949c2d39730631d8" exitCode=0 Apr 20 10:02:37.388782 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:37.388417 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gq8j4" event={"ID":"c59a8f97-78e5-4ba1-9567-dad2612f9fee","Type":"ContainerDied","Data":"d1ed342ddefc0a093ea440378226a1a643a4ab9e2bc8532b949c2d39730631d8"} Apr 20 10:02:38.395179 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:38.395148 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gq8j4" event={"ID":"c59a8f97-78e5-4ba1-9567-dad2612f9fee","Type":"ContainerStarted","Data":"7ea6342a515e52832cfc7b9c7647cf16f913c5f0cfa0382c14984467782c6663"} Apr 20 10:02:38.395179 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:38.395182 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gq8j4" 
event={"ID":"c59a8f97-78e5-4ba1-9567-dad2612f9fee","Type":"ContainerStarted","Data":"c1e8ceaea88e7fc5926823f9332f1d961ebd1e1485430ddbf558dd8b57b10636"} Apr 20 10:02:38.427595 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:38.427544 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gq8j4" podStartSLOduration=2.543898917 podStartE2EDuration="3.427529009s" podCreationTimestamp="2026-04-20 10:02:35 +0000 UTC" firstStartedPulling="2026-04-20 10:02:35.986280277 +0000 UTC m=+83.645942937" lastFinishedPulling="2026-04-20 10:02:36.869910368 +0000 UTC m=+84.529573029" observedRunningTime="2026-04-20 10:02:38.427289711 +0000 UTC m=+86.086952391" watchObservedRunningTime="2026-04-20 10:02:38.427529009 +0000 UTC m=+86.087191689" Apr 20 10:02:39.278593 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:39.278566 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:02:42.832645 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:42.832609 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7df4665cc4-bl69s"] Apr 20 10:02:53.349473 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:02:53.349443 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kd7l2" Apr 20 10:03:05.298568 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:05.298531 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qxrrr_55c7ba10-8caa-41b9-bbae-94fed9831f88/serve-healthcheck-canary/0.log" Apr 20 10:03:07.851547 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:07.851506 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" podUID="3b5b3a62-5e63-40aa-8183-d8243b4b590c" containerName="registry" containerID="cri-o://8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45" gracePeriod=30 Apr 20 10:03:08.089327 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.089304 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:03:08.099128 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099099 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzv5q\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-kube-api-access-wzv5q\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099236 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099157 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-certificates\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099236 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099181 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099236 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099199 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-bound-sa-token\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099236 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099219 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b5b3a62-5e63-40aa-8183-d8243b4b590c-ca-trust-extracted\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099437 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099243 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-image-registry-private-configuration\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099437 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099281 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-installation-pull-secrets\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099437 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099327 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-trusted-ca\") pod \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\" (UID: \"3b5b3a62-5e63-40aa-8183-d8243b4b590c\") " Apr 20 10:03:08.099761 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099723 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:03:08.099929 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.099900 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:03:08.101849 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.101744 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:03:08.101960 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.101883 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:03:08.101960 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.101938 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-kube-api-access-wzv5q" (OuterVolumeSpecName: "kube-api-access-wzv5q") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "kube-api-access-wzv5q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:03:08.102076 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.101963 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:03:08.102076 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.101965 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:03:08.108123 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.108095 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5b3a62-5e63-40aa-8183-d8243b4b590c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3b5b3a62-5e63-40aa-8183-d8243b4b590c" (UID: "3b5b3a62-5e63-40aa-8183-d8243b4b590c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 10:03:08.200214 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200169 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b5b3a62-5e63-40aa-8183-d8243b4b590c-ca-trust-extracted\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200214 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200209 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-image-registry-private-configuration\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200214 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200220 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b5b3a62-5e63-40aa-8183-d8243b4b590c-installation-pull-secrets\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200489 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200232 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-trusted-ca\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200489 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200241 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzv5q\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-kube-api-access-wzv5q\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200489 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200250 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-certificates\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200489 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200259 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-registry-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.200489 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.200268 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b5b3a62-5e63-40aa-8183-d8243b4b590c-bound-sa-token\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:03:08.475138 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.475039 2566 generic.go:358] "Generic (PLEG): container finished" podID="3b5b3a62-5e63-40aa-8183-d8243b4b590c" containerID="8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45" exitCode=0 Apr 20 10:03:08.475138 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.475087 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" event={"ID":"3b5b3a62-5e63-40aa-8183-d8243b4b590c","Type":"ContainerDied","Data":"8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45"} Apr 20 10:03:08.475138 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.475119 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" 
event={"ID":"3b5b3a62-5e63-40aa-8183-d8243b4b590c","Type":"ContainerDied","Data":"46198211d0eae2a929d8659a88f25b118ba2c57c44255680e52f95edf011ada3"} Apr 20 10:03:08.475138 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.475134 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df4665cc4-bl69s" Apr 20 10:03:08.475138 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.475140 2566 scope.go:117] "RemoveContainer" containerID="8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45" Apr 20 10:03:08.486935 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.486900 2566 scope.go:117] "RemoveContainer" containerID="8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45" Apr 20 10:03:08.487409 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:03:08.487328 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45\": container with ID starting with 8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45 not found: ID does not exist" containerID="8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45" Apr 20 10:03:08.487530 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.487392 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45"} err="failed to get container status \"8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45\": rpc error: code = NotFound desc = could not find container \"8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45\": container with ID starting with 8a5667ad4b75f01e1d3b0cf6801778a0185cef7ca0b008cb1dc87372b1a04f45 not found: ID does not exist" Apr 20 10:03:08.499090 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.499058 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7df4665cc4-bl69s"] Apr 20 10:03:08.502593 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.502552 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7df4665cc4-bl69s"] Apr 20 10:03:08.859647 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:08.859617 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5b3a62-5e63-40aa-8183-d8243b4b590c" path="/var/lib/kubelet/pods/3b5b3a62-5e63-40aa-8183-d8243b4b590c/volumes" Apr 20 10:03:26.945363 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:26.945317 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" podUID="69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 10:03:36.944651 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:36.944606 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" podUID="69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 10:03:46.944796 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:46.944746 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" podUID="69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 10:03:46.945257 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:46.944824 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" Apr 20 10:03:46.945319 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:46.945300 2566 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ddbe4a906602483eb610f060aac25b44a26a41cabbae3252fd9cb3dde001d7d3"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 10:03:46.945393 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:46.945340 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" podUID="69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2" containerName="service-proxy" containerID="cri-o://ddbe4a906602483eb610f060aac25b44a26a41cabbae3252fd9cb3dde001d7d3" gracePeriod=30 Apr 20 10:03:47.574438 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:47.574400 2566 generic.go:358] "Generic (PLEG): container finished" podID="69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2" containerID="ddbe4a906602483eb610f060aac25b44a26a41cabbae3252fd9cb3dde001d7d3" exitCode=2 Apr 20 10:03:47.574614 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:47.574478 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" event={"ID":"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2","Type":"ContainerDied","Data":"ddbe4a906602483eb610f060aac25b44a26a41cabbae3252fd9cb3dde001d7d3"} Apr 20 10:03:47.574614 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:03:47.574519 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65b867f9f7-jx4m7" event={"ID":"69cb98b9-86bf-4b0a-ab3b-fba6d6d789b2","Type":"ContainerStarted","Data":"a1b5d2f2338bf420aabb6864a8411e38f4105ca13fd549a8ed4a88fe882e9ef9"} Apr 20 10:06:12.789859 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:12.789831 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:06:12.790512 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:12.789948 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:06:12.799625 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:12.799600 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 10:06:58.101124 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.101087 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-jsx6r"] Apr 20 10:06:58.101564 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.101406 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b5b3a62-5e63-40aa-8183-d8243b4b590c" containerName="registry" Apr 20 10:06:58.101564 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.101419 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5b3a62-5e63-40aa-8183-d8243b4b590c" containerName="registry" Apr 20 10:06:58.101564 ip-10-0-137-106 
kubenswrapper[2566]: I0420 10:06:58.101468 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b5b3a62-5e63-40aa-8183-d8243b4b590c" containerName="registry" Apr 20 10:06:58.104028 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.104011 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.106142 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.106121 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 10:06:58.106569 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.106548 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-pkxt7\"" Apr 20 10:06:58.106679 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.106574 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 10:06:58.112293 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.112263 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-jsx6r"] Apr 20 10:06:58.185793 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.185751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkptm\" (UniqueName: \"kubernetes.io/projected/25bd609b-0fdc-4ca1-a31b-ba193d0cebbe-kube-api-access-wkptm\") pod \"cert-manager-cainjector-8966b78d4-jsx6r\" (UID: \"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.185974 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.185802 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25bd609b-0fdc-4ca1-a31b-ba193d0cebbe-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-jsx6r\" (UID: \"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.286320 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.286280 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkptm\" (UniqueName: \"kubernetes.io/projected/25bd609b-0fdc-4ca1-a31b-ba193d0cebbe-kube-api-access-wkptm\") pod \"cert-manager-cainjector-8966b78d4-jsx6r\" (UID: \"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.286513 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.286392 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25bd609b-0fdc-4ca1-a31b-ba193d0cebbe-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-jsx6r\" (UID: \"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.294297 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.294271 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25bd609b-0fdc-4ca1-a31b-ba193d0cebbe-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-jsx6r\" (UID: \"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.294471 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.294304 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkptm\" (UniqueName: \"kubernetes.io/projected/25bd609b-0fdc-4ca1-a31b-ba193d0cebbe-kube-api-access-wkptm\") pod \"cert-manager-cainjector-8966b78d4-jsx6r\" (UID: \"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.413199 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.413113 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" Apr 20 10:06:58.548127 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.548089 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-jsx6r"] Apr 20 10:06:58.552560 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:06:58.552525 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd609b_0fdc_4ca1_a31b_ba193d0cebbe.slice/crio-77775c49bc75631f6d704ab23812f894c43d670014e3c2b1f60faf1f2776e067 WatchSource:0}: Error finding container 77775c49bc75631f6d704ab23812f894c43d670014e3c2b1f60faf1f2776e067: Status 404 returned error can't find the container with id 77775c49bc75631f6d704ab23812f894c43d670014e3c2b1f60faf1f2776e067 Apr 20 10:06:58.554307 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:58.554289 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:06:59.054335 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:06:59.054301 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" event={"ID":"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe","Type":"ContainerStarted","Data":"77775c49bc75631f6d704ab23812f894c43d670014e3c2b1f60faf1f2776e067"} Apr 20 10:07:04.069943 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:04.069899 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" event={"ID":"25bd609b-0fdc-4ca1-a31b-ba193d0cebbe","Type":"ContainerStarted","Data":"6098abbb42d4c8d99d41458635bbcf67bd0025e91d92f92ffb85f949a8314394"} Apr 20 10:07:04.088907 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:04.088855 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-jsx6r" podStartSLOduration=1.273415079 podStartE2EDuration="6.088837556s" podCreationTimestamp="2026-04-20 10:06:58 +0000 UTC" firstStartedPulling="2026-04-20 10:06:58.554443033 +0000 UTC m=+346.214105692" lastFinishedPulling="2026-04-20 10:07:03.369865496 +0000 UTC m=+351.029528169" observedRunningTime="2026-04-20 10:07:04.088317678 +0000 UTC m=+351.747980357" watchObservedRunningTime="2026-04-20 10:07:04.088837556 +0000 UTC m=+351.748500219" Apr 20 10:07:57.334404 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.334364 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j"] Apr 20 10:07:57.337288 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.337269 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.340268 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.340243 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-qqqv2\"" Apr 20 10:07:57.340268 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.340242 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 20 10:07:57.340854 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.340827 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 20 10:07:57.340949 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.340837 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 10:07:57.340949 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.340872 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 10:07:57.349175 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.349148 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j"] Apr 20 10:07:57.412045 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.412013 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac73c45f-15b7-4308-a46e-c9d897d078d1-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.412252 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.412079 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/ac73c45f-15b7-4308-a46e-c9d897d078d1-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.412252 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.412133 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndbz\" (UniqueName: \"kubernetes.io/projected/ac73c45f-15b7-4308-a46e-c9d897d078d1-kube-api-access-tndbz\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.513569 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.513530 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/ac73c45f-15b7-4308-a46e-c9d897d078d1-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.513569 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.513568 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndbz\" (UniqueName: 
\"kubernetes.io/projected/ac73c45f-15b7-4308-a46e-c9d897d078d1-kube-api-access-tndbz\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.513793 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.513602 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac73c45f-15b7-4308-a46e-c9d897d078d1-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.514187 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.514165 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/ac73c45f-15b7-4308-a46e-c9d897d078d1-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.515932 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.515912 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac73c45f-15b7-4308-a46e-c9d897d078d1-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.525277 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.525247 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndbz\" (UniqueName: \"kubernetes.io/projected/ac73c45f-15b7-4308-a46e-c9d897d078d1-kube-api-access-tndbz\") pod \"kubeflow-trainer-controller-manager-55f5694779-lw68j\" (UID: \"ac73c45f-15b7-4308-a46e-c9d897d078d1\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.646313 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.646220 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:07:57.767768 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:57.767733 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j"] Apr 20 10:07:57.771036 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:07:57.771002 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac73c45f_15b7_4308_a46e_c9d897d078d1.slice/crio-5983b1cee702e6adf1e01f3411667d5a2ecded358ed4983830eb165bc1a47093 WatchSource:0}: Error finding container 5983b1cee702e6adf1e01f3411667d5a2ecded358ed4983830eb165bc1a47093: Status 404 returned error can't find the container with id 5983b1cee702e6adf1e01f3411667d5a2ecded358ed4983830eb165bc1a47093 Apr 20 10:07:58.218644 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:07:58.218606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" event={"ID":"ac73c45f-15b7-4308-a46e-c9d897d078d1","Type":"ContainerStarted","Data":"5983b1cee702e6adf1e01f3411667d5a2ecded358ed4983830eb165bc1a47093"} Apr 20 10:08:01.228366 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:08:01.228305 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" event={"ID":"ac73c45f-15b7-4308-a46e-c9d897d078d1","Type":"ContainerStarted","Data":"66741afe4ea9f465336d106200ca8eba4ff818e3f8609e206e15b2346025b1b1"} Apr 20 10:08:01.228783 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:08:01.228546 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:08:01.245574 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:08:01.245525 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" podStartSLOduration=1.8116051469999999 podStartE2EDuration="4.245510269s" podCreationTimestamp="2026-04-20 10:07:57 +0000 UTC" firstStartedPulling="2026-04-20 10:07:57.772775133 +0000 UTC m=+405.432437792" lastFinishedPulling="2026-04-20 10:08:00.206680255 +0000 UTC m=+407.866342914" observedRunningTime="2026-04-20 10:08:01.244498318 +0000 UTC m=+408.904161003" watchObservedRunningTime="2026-04-20 10:08:01.245510269 +0000 UTC m=+408.905172950" Apr 20 10:08:17.236435 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:08:17.236405 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-lw68j" Apr 20 10:09:55.267094 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.267050 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr"] Apr 20 10:09:55.268920 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.268901 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:09:55.270914 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.270882 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-9br85\"/\"openshift-service-ca.crt\"" Apr 20 10:09:55.271275 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.271256 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-9br85\"/\"default-dockercfg-4kqdj\"" Apr 20 10:09:55.271393 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.271323 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-9br85\"/\"kube-root-ca.crt\"" Apr 20 10:09:55.277248 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.277219 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr"] Apr 20 10:09:55.343663 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.343612 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rcb\" (UniqueName: \"kubernetes.io/projected/c1894698-cdbf-469d-9dcd-b5b67734bff9-kube-api-access-p9rcb\") pod \"test-trainjob-hs94f-node-0-0-b5cdr\" (UID: \"c1894698-cdbf-469d-9dcd-b5b67734bff9\") " pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:09:55.444945 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.444889 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rcb\" (UniqueName: \"kubernetes.io/projected/c1894698-cdbf-469d-9dcd-b5b67734bff9-kube-api-access-p9rcb\") pod \"test-trainjob-hs94f-node-0-0-b5cdr\" (UID: \"c1894698-cdbf-469d-9dcd-b5b67734bff9\") " pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:09:55.453615 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.453585 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rcb\" (UniqueName: \"kubernetes.io/projected/c1894698-cdbf-469d-9dcd-b5b67734bff9-kube-api-access-p9rcb\") pod \"test-trainjob-hs94f-node-0-0-b5cdr\" (UID: \"c1894698-cdbf-469d-9dcd-b5b67734bff9\") " pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:09:55.578389 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.578245 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:09:55.697078 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:55.697051 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr"] Apr 20 10:09:55.699749 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:09:55.699719 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1894698_cdbf_469d_9dcd_b5b67734bff9.slice/crio-a141222bbb08e3c6ee4a4f5bfe91d4fcb761d1c7d74cfb52915a9e84cd6e7526 WatchSource:0}: Error finding container a141222bbb08e3c6ee4a4f5bfe91d4fcb761d1c7d74cfb52915a9e84cd6e7526: Status 404 returned error can't find the container with id a141222bbb08e3c6ee4a4f5bfe91d4fcb761d1c7d74cfb52915a9e84cd6e7526 Apr 20 10:09:56.528252 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:09:56.528188 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" event={"ID":"c1894698-cdbf-469d-9dcd-b5b67734bff9","Type":"ContainerStarted","Data":"a141222bbb08e3c6ee4a4f5bfe91d4fcb761d1c7d74cfb52915a9e84cd6e7526"} Apr 20 10:11:12.812750 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:11:12.812721 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:11:12.813263 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:11:12.813069 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:14:11.265380 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:11.265329 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" event={"ID":"c1894698-cdbf-469d-9dcd-b5b67734bff9","Type":"ContainerStarted","Data":"b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c"} Apr 20 10:14:11.293063 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:11.293010 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" podStartSLOduration=1.726354628 podStartE2EDuration="4m16.292990356s" podCreationTimestamp="2026-04-20 10:09:55 +0000 UTC" firstStartedPulling="2026-04-20 10:09:55.701791382 +0000 UTC m=+523.361454040" lastFinishedPulling="2026-04-20 10:14:10.268427092 +0000 UTC m=+777.928089768" observedRunningTime="2026-04-20 10:14:11.292251453 +0000 UTC m=+778.951914135" watchObservedRunningTime="2026-04-20 10:14:11.292990356 +0000 UTC m=+778.952653065" Apr 20 10:14:16.281457 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:16.281424 2566 generic.go:358] "Generic (PLEG): container finished" podID="c1894698-cdbf-469d-9dcd-b5b67734bff9" containerID="b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c" exitCode=0 Apr 20 10:14:16.282035 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:16.281507 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" event={"ID":"c1894698-cdbf-469d-9dcd-b5b67734bff9","Type":"ContainerDied","Data":"b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c"} Apr 20 10:14:17.419429 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:17.419404 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:14:17.523609 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:17.523567 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9rcb\" (UniqueName: \"kubernetes.io/projected/c1894698-cdbf-469d-9dcd-b5b67734bff9-kube-api-access-p9rcb\") pod \"c1894698-cdbf-469d-9dcd-b5b67734bff9\" (UID: \"c1894698-cdbf-469d-9dcd-b5b67734bff9\") " Apr 20 10:14:17.525757 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:17.525728 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1894698-cdbf-469d-9dcd-b5b67734bff9-kube-api-access-p9rcb" (OuterVolumeSpecName: "kube-api-access-p9rcb") pod "c1894698-cdbf-469d-9dcd-b5b67734bff9" (UID: "c1894698-cdbf-469d-9dcd-b5b67734bff9"). InnerVolumeSpecName "kube-api-access-p9rcb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:14:17.624613 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:17.624528 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9rcb\" (UniqueName: \"kubernetes.io/projected/c1894698-cdbf-469d-9dcd-b5b67734bff9-kube-api-access-p9rcb\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:14:18.289173 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.289142 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" event={"ID":"c1894698-cdbf-469d-9dcd-b5b67734bff9","Type":"ContainerDied","Data":"a141222bbb08e3c6ee4a4f5bfe91d4fcb761d1c7d74cfb52915a9e84cd6e7526"} Apr 20 10:14:18.289173 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.289176 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a141222bbb08e3c6ee4a4f5bfe91d4fcb761d1c7d74cfb52915a9e84cd6e7526" Apr 20 10:14:18.289388 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.289155 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr" Apr 20 10:14:18.707237 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.707156 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz"] Apr 20 10:14:18.707624 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.707417 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1894698-cdbf-469d-9dcd-b5b67734bff9" containerName="node" Apr 20 10:14:18.707624 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.707430 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1894698-cdbf-469d-9dcd-b5b67734bff9" containerName="node" Apr 20 10:14:18.707624 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.707469 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1894698-cdbf-469d-9dcd-b5b67734bff9" containerName="node" Apr 20 10:14:18.959301 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.959215 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:14:18.961703 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.961679 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-6dr82\"/\"openshift-service-ca.crt\"" Apr 20 10:14:18.961860 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.961681 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-6dr82\"/\"kube-root-ca.crt\"" Apr 20 10:14:18.961860 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.961689 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-6dr82\"/\"default-dockercfg-8wlgw\"" Apr 20 10:14:18.963088 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:18.963069 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz"] Apr 20 10:14:19.035924 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:19.035888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbs2q\" (UniqueName: \"kubernetes.io/projected/6bdb93cf-a53a-4f2b-973c-65e91f623461-kube-api-access-pbs2q\") pod \"test-trainjob-h8448-node-0-0-wgskz\" (UID: \"6bdb93cf-a53a-4f2b-973c-65e91f623461\") " pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:14:19.136724 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:19.136690 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbs2q\" (UniqueName: \"kubernetes.io/projected/6bdb93cf-a53a-4f2b-973c-65e91f623461-kube-api-access-pbs2q\") pod \"test-trainjob-h8448-node-0-0-wgskz\" (UID: \"6bdb93cf-a53a-4f2b-973c-65e91f623461\") " pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:14:19.144851 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:19.144825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbs2q\" (UniqueName: \"kubernetes.io/projected/6bdb93cf-a53a-4f2b-973c-65e91f623461-kube-api-access-pbs2q\") pod \"test-trainjob-h8448-node-0-0-wgskz\" (UID: \"6bdb93cf-a53a-4f2b-973c-65e91f623461\") " pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:14:19.268243 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:19.268207 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:14:19.392206 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:19.392157 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz"] Apr 20 10:14:19.394758 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:14:19.394728 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bdb93cf_a53a_4f2b_973c_65e91f623461.slice/crio-d4e04a0fd49a3aa80f415a85f52742cf325e882f54a032dc4a1eba8d685e3976 WatchSource:0}: Error finding container d4e04a0fd49a3aa80f415a85f52742cf325e882f54a032dc4a1eba8d685e3976: Status 404 returned error can't find the container with id d4e04a0fd49a3aa80f415a85f52742cf325e882f54a032dc4a1eba8d685e3976 Apr 20 10:14:19.396493 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:19.396477 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:14:20.297412 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:14:20.297371 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" event={"ID":"6bdb93cf-a53a-4f2b-973c-65e91f623461","Type":"ContainerStarted","Data":"d4e04a0fd49a3aa80f415a85f52742cf325e882f54a032dc4a1eba8d685e3976"} Apr 20 10:16:12.833333 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:16:12.833241 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:16:12.835603 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:16:12.835575 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:18:24.997681 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:24.997645 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" event={"ID":"6bdb93cf-a53a-4f2b-973c-65e91f623461","Type":"ContainerStarted","Data":"1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381"} Apr 20 10:18:25.026908 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:25.026849 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" podStartSLOduration=1.627652884 podStartE2EDuration="4m7.026831464s" podCreationTimestamp="2026-04-20 10:14:18 +0000 UTC" firstStartedPulling="2026-04-20 10:14:19.396647659 +0000 UTC m=+787.056310317" lastFinishedPulling="2026-04-20 10:18:24.795826223 +0000 UTC m=+1032.455488897" observedRunningTime="2026-04-20 10:18:25.025529448 +0000 UTC m=+1032.685192128" watchObservedRunningTime="2026-04-20 10:18:25.026831464 +0000 UTC m=+1032.686494147" Apr 20 10:18:32.017301 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:32.017262 2566 generic.go:358] "Generic (PLEG): container finished" podID="6bdb93cf-a53a-4f2b-973c-65e91f623461" containerID="1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381" exitCode=0 Apr 20 10:18:32.017764 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:32.017338 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" event={"ID":"6bdb93cf-a53a-4f2b-973c-65e91f623461","Type":"ContainerDied","Data":"1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381"} Apr 20 10:18:33.360449 ip-10-0-137-106 kubenswrapper[2566]: I0420 
10:18:33.360419 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:18:33.456504 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:33.456470 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbs2q\" (UniqueName: \"kubernetes.io/projected/6bdb93cf-a53a-4f2b-973c-65e91f623461-kube-api-access-pbs2q\") pod \"6bdb93cf-a53a-4f2b-973c-65e91f623461\" (UID: \"6bdb93cf-a53a-4f2b-973c-65e91f623461\") " Apr 20 10:18:33.458647 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:33.458617 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdb93cf-a53a-4f2b-973c-65e91f623461-kube-api-access-pbs2q" (OuterVolumeSpecName: "kube-api-access-pbs2q") pod "6bdb93cf-a53a-4f2b-973c-65e91f623461" (UID: "6bdb93cf-a53a-4f2b-973c-65e91f623461"). InnerVolumeSpecName "kube-api-access-pbs2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:18:33.557499 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:33.557415 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pbs2q\" (UniqueName: \"kubernetes.io/projected/6bdb93cf-a53a-4f2b-973c-65e91f623461-kube-api-access-pbs2q\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:18:34.024082 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:34.024051 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" Apr 20 10:18:34.024267 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:34.024059 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz" event={"ID":"6bdb93cf-a53a-4f2b-973c-65e91f623461","Type":"ContainerDied","Data":"d4e04a0fd49a3aa80f415a85f52742cf325e882f54a032dc4a1eba8d685e3976"} Apr 20 10:18:34.024267 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:34.024162 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e04a0fd49a3aa80f415a85f52742cf325e882f54a032dc4a1eba8d685e3976" Apr 20 10:18:35.041470 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.041437 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk"] Apr 20 10:18:35.041874 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.041670 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bdb93cf-a53a-4f2b-973c-65e91f623461" containerName="node" Apr 20 10:18:35.041874 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.041681 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdb93cf-a53a-4f2b-973c-65e91f623461" containerName="node" Apr 20 10:18:35.041874 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.041721 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bdb93cf-a53a-4f2b-973c-65e91f623461" containerName="node" Apr 20 10:18:35.361846 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.361751 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk"] Apr 20 10:18:35.362015 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.361866 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:18:35.367125 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.367095 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ctwg5\"/\"kube-root-ca.crt\"" Apr 20 10:18:35.367294 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.367226 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ctwg5\"/\"openshift-service-ca.crt\"" Apr 20 10:18:35.367547 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.367530 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-ctwg5\"/\"default-dockercfg-jk577\"" Apr 20 10:18:35.470640 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.470596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drrxj\" (UniqueName: \"kubernetes.io/projected/69a6f9ce-8501-411b-b66f-e98e3efae6a3-kube-api-access-drrxj\") pod \"test-trainjob-z456d-node-0-0-6hgjk\" (UID: \"69a6f9ce-8501-411b-b66f-e98e3efae6a3\") " pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:18:35.571646 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.571607 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drrxj\" (UniqueName: \"kubernetes.io/projected/69a6f9ce-8501-411b-b66f-e98e3efae6a3-kube-api-access-drrxj\") pod \"test-trainjob-z456d-node-0-0-6hgjk\" (UID: \"69a6f9ce-8501-411b-b66f-e98e3efae6a3\") " pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:18:35.580260 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.580227 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drrxj\" (UniqueName: \"kubernetes.io/projected/69a6f9ce-8501-411b-b66f-e98e3efae6a3-kube-api-access-drrxj\") pod \"test-trainjob-z456d-node-0-0-6hgjk\" (UID: \"69a6f9ce-8501-411b-b66f-e98e3efae6a3\") " pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:18:35.671077 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.670992 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:18:35.790916 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:35.790744 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk"] Apr 20 10:18:35.793514 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:18:35.793483 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69a6f9ce_8501_411b_b66f_e98e3efae6a3.slice/crio-55f3aaf246a31aef866bab8ff39d61fa1146db4ced9c1d14d8d950b6e96ca3d7 WatchSource:0}: Error finding container 55f3aaf246a31aef866bab8ff39d61fa1146db4ced9c1d14d8d950b6e96ca3d7: Status 404 returned error can't find the container with id 55f3aaf246a31aef866bab8ff39d61fa1146db4ced9c1d14d8d950b6e96ca3d7 Apr 20 10:18:36.030250 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:18:36.030212 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" event={"ID":"69a6f9ce-8501-411b-b66f-e98e3efae6a3","Type":"ContainerStarted","Data":"55f3aaf246a31aef866bab8ff39d61fa1146db4ced9c1d14d8d950b6e96ca3d7"} Apr 20 10:19:48.249751 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:48.249711 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" event={"ID":"69a6f9ce-8501-411b-b66f-e98e3efae6a3","Type":"ContainerStarted","Data":"0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0"} Apr 20 10:19:48.267797 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:48.267722 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" podStartSLOduration=1.9080003749999999 podStartE2EDuration="1m13.267701949s" podCreationTimestamp="2026-04-20 10:18:35 +0000 UTC" firstStartedPulling="2026-04-20 10:18:35.795469573 +0000 UTC m=+1043.455132235" lastFinishedPulling="2026-04-20 10:19:47.15517115 +0000 UTC m=+1114.814833809" observedRunningTime="2026-04-20 10:19:48.265514337 +0000 UTC m=+1115.925177018" watchObservedRunningTime="2026-04-20 10:19:48.267701949 +0000 UTC m=+1115.927364650" Apr 20 10:19:51.261019 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:51.260981 2566 generic.go:358] "Generic (PLEG): container finished" podID="69a6f9ce-8501-411b-b66f-e98e3efae6a3" containerID="0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0" exitCode=0 Apr 20 10:19:51.261490 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:51.261056 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" event={"ID":"69a6f9ce-8501-411b-b66f-e98e3efae6a3","Type":"ContainerDied","Data":"0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0"} Apr 20 10:19:52.395505 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:52.395478 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:19:52.508739 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:52.508693 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drrxj\" (UniqueName: \"kubernetes.io/projected/69a6f9ce-8501-411b-b66f-e98e3efae6a3-kube-api-access-drrxj\") pod \"69a6f9ce-8501-411b-b66f-e98e3efae6a3\" (UID: \"69a6f9ce-8501-411b-b66f-e98e3efae6a3\") " Apr 20 10:19:52.510907 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:52.510876 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a6f9ce-8501-411b-b66f-e98e3efae6a3-kube-api-access-drrxj" (OuterVolumeSpecName: "kube-api-access-drrxj") pod "69a6f9ce-8501-411b-b66f-e98e3efae6a3" (UID: "69a6f9ce-8501-411b-b66f-e98e3efae6a3"). InnerVolumeSpecName "kube-api-access-drrxj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:19:52.609549 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:52.609460 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-drrxj\" (UniqueName: \"kubernetes.io/projected/69a6f9ce-8501-411b-b66f-e98e3efae6a3-kube-api-access-drrxj\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:19:53.267549 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:53.267520 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" Apr 20 10:19:53.267721 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:53.267517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk" event={"ID":"69a6f9ce-8501-411b-b66f-e98e3efae6a3","Type":"ContainerDied","Data":"55f3aaf246a31aef866bab8ff39d61fa1146db4ced9c1d14d8d950b6e96ca3d7"} Apr 20 10:19:53.267721 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:19:53.267633 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f3aaf246a31aef866bab8ff39d61fa1146db4ced9c1d14d8d950b6e96ca3d7" Apr 20 10:21:12.851243 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:21:12.851202 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:21:12.854276 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:21:12.854253 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:26:12.867644 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:12.867612 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:26:12.871517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:12.871497 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:26:31.601526 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.601440 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr"] Apr 20 10:26:31.601883 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.601683 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a6f9ce-8501-411b-b66f-e98e3efae6a3" containerName="node" Apr 20 10:26:31.601883 ip-10-0-137-106 kubenswrapper[2566]: I0420 
10:26:31.601694 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a6f9ce-8501-411b-b66f-e98e3efae6a3" containerName="node" Apr 20 10:26:31.601883 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.601736 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="69a6f9ce-8501-411b-b66f-e98e3efae6a3" containerName="node" Apr 20 10:26:31.604294 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.604273 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:26:31.606539 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.606508 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-nslk9\"/\"kube-root-ca.crt\"" Apr 20 10:26:31.606664 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.606544 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-nslk9\"/\"openshift-service-ca.crt\"" Apr 20 10:26:31.606874 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.606859 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-nslk9\"/\"default-dockercfg-zmp4m\"" Apr 20 10:26:31.615830 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.615800 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr"] Apr 20 10:26:31.753426 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.753381 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc285\" (UniqueName: \"kubernetes.io/projected/23744d4a-893f-4c73-8929-e7ea90bd2232-kube-api-access-nc285\") pod \"test-trainjob-k75fw-node-0-0-d7fsr\" (UID: \"23744d4a-893f-4c73-8929-e7ea90bd2232\") " pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:26:31.853796 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.853701 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc285\" (UniqueName: \"kubernetes.io/projected/23744d4a-893f-4c73-8929-e7ea90bd2232-kube-api-access-nc285\") pod \"test-trainjob-k75fw-node-0-0-d7fsr\" (UID: \"23744d4a-893f-4c73-8929-e7ea90bd2232\") " pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:26:31.862329 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.862299 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc285\" (UniqueName: \"kubernetes.io/projected/23744d4a-893f-4c73-8929-e7ea90bd2232-kube-api-access-nc285\") pod \"test-trainjob-k75fw-node-0-0-d7fsr\" (UID: \"23744d4a-893f-4c73-8929-e7ea90bd2232\") " pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:26:31.913274 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:31.913221 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:26:32.062969 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:32.062933 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr"] Apr 20 10:26:32.066100 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:26:32.066064 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23744d4a_893f_4c73_8929_e7ea90bd2232.slice/crio-a2bf86128f3918e087a89adbdfd8315d3f8ec6bc029b0c216070fda870c991ac WatchSource:0}: Error finding container a2bf86128f3918e087a89adbdfd8315d3f8ec6bc029b0c216070fda870c991ac: Status 404 returned error can't find the container with id a2bf86128f3918e087a89adbdfd8315d3f8ec6bc029b0c216070fda870c991ac Apr 20 10:26:32.068439 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:32.068421 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:26:32.339489 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:26:32.339448 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" event={"ID":"23744d4a-893f-4c73-8929-e7ea90bd2232","Type":"ContainerStarted","Data":"a2bf86128f3918e087a89adbdfd8315d3f8ec6bc029b0c216070fda870c991ac"} Apr 20 10:31:12.884422 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:31:12.884299 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:31:12.888507 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:31:12.888487 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:33:46.605924 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:33:46.605864 2566 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 20 10:33:46.606468 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:33:46.605935 2566 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 20 10:33:46.606468 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:33:46.605949 2566 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 20 10:35:16.940436 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:16.940404 2566 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage" Apr 20 10:35:16.940962 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:16.940473 2566 container_gc.go:86] "Attempting to delete unused containers" Apr 20 10:35:16.941605 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:16.941586 2566 scope.go:117] "RemoveContainer" containerID="0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0" Apr 20 10:35:23.227829 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:23.227789 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasDiskPressure" Apr 20 10:35:46.266384 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:46.265970 2566 scope.go:117] "RemoveContainer" containerID="1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381" Apr 20 10:35:46.315117 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:46.315091 2566 
scope.go:117] "RemoveContainer" containerID="b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c" Apr 20 10:35:46.412174 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:46.412157 2566 image_gc_manager.go:447] "Attempting to delete unused images" Apr 20 10:35:46.423029 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:46.423005 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:35:46.426797 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:46.426769 2566 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler="" Apr 20 10:35:46.818386 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:46.818326 2566 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler="" Apr 20 10:35:48.688738 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:48.688699 2566 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler="" Apr 20 10:35:54.279406 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:35:54.279336 2566 image_gc_manager.go:514] "Removing image to free bytes" imageID="ad110250a85fcdba558f7f776c90e8eeba85487d69852b32b99f6e3e85c4336a" size=23201654703 runtimeHandler="" Apr 20 10:36:00.171710 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:00.171663 2566 image_gc_manager.go:514] "Removing image to free bytes" imageID="c56b70857a65504b2107e649861acc5cf43efd751e6a3e01a85cd7b6fa816db3" size=7588072889 runtimeHandler="" Apr 20 10:36:05.035862 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:05.035831 2566 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." 
resourceName="ephemeral-storage" Apr 20 10:36:06.029517 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:06.029479 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" event={"ID":"23744d4a-893f-4c73-8929-e7ea90bd2232","Type":"ContainerStarted","Data":"363c7fdbf62a13149c8ba1c48e9e6e52e5b68549d32dc7ec061353a7d380c1e7"} Apr 20 10:36:06.031539 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:06.031519 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-nslk9\"/\"default-dockercfg-zmp4m\"" Apr 20 10:36:06.056196 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:06.056141 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" podStartSLOduration=6.952761626 podStartE2EDuration="9m35.056124971s" podCreationTimestamp="2026-04-20 10:26:31 +0000 UTC" firstStartedPulling="2026-04-20 10:26:32.068550594 +0000 UTC m=+1519.728213252" lastFinishedPulling="2026-04-20 10:36:00.171913932 +0000 UTC m=+2087.831576597" observedRunningTime="2026-04-20 10:36:06.054721338 +0000 UTC m=+2093.714384017" watchObservedRunningTime="2026-04-20 10:36:06.056124971 +0000 UTC m=+2093.715787653" Apr 20 10:36:06.172742 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:06.172711 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-nslk9\"/\"kube-root-ca.crt\"" Apr 20 10:36:06.182754 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:06.182729 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-nslk9\"/\"openshift-service-ca.crt\"" Apr 20 10:36:12.902584 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:12.902466 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:36:12.945252 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:12.905001 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:36:27.085382 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:27.085331 2566 generic.go:358] "Generic (PLEG): container finished" podID="23744d4a-893f-4c73-8929-e7ea90bd2232" containerID="363c7fdbf62a13149c8ba1c48e9e6e52e5b68549d32dc7ec061353a7d380c1e7" exitCode=0 Apr 20 10:36:27.085810 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:27.085402 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" event={"ID":"23744d4a-893f-4c73-8929-e7ea90bd2232","Type":"ContainerDied","Data":"363c7fdbf62a13149c8ba1c48e9e6e52e5b68549d32dc7ec061353a7d380c1e7"} Apr 20 10:36:28.313906 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:28.313882 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:36:28.336872 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:28.336838 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc285\" (UniqueName: \"kubernetes.io/projected/23744d4a-893f-4c73-8929-e7ea90bd2232-kube-api-access-nc285\") pod \"23744d4a-893f-4c73-8929-e7ea90bd2232\" (UID: \"23744d4a-893f-4c73-8929-e7ea90bd2232\") " Apr 20 10:36:28.339305 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:28.339235 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23744d4a-893f-4c73-8929-e7ea90bd2232-kube-api-access-nc285" (OuterVolumeSpecName: "kube-api-access-nc285") pod "23744d4a-893f-4c73-8929-e7ea90bd2232" (UID: "23744d4a-893f-4c73-8929-e7ea90bd2232"). InnerVolumeSpecName "kube-api-access-nc285". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:36:28.437767 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:28.437730 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nc285\" (UniqueName: \"kubernetes.io/projected/23744d4a-893f-4c73-8929-e7ea90bd2232-kube-api-access-nc285\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 20 10:36:29.090813 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:29.090772 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" event={"ID":"23744d4a-893f-4c73-8929-e7ea90bd2232","Type":"ContainerDied","Data":"a2bf86128f3918e087a89adbdfd8315d3f8ec6bc029b0c216070fda870c991ac"} Apr 20 10:36:29.090813 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:29.090803 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr" Apr 20 10:36:29.090813 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:29.090814 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2bf86128f3918e087a89adbdfd8315d3f8ec6bc029b0c216070fda870c991ac" Apr 20 10:36:29.376444 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:29.376407 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-nslk9_test-trainjob-k75fw-node-0-0-d7fsr_23744d4a-893f-4c73-8929-e7ea90bd2232/node/0.log" Apr 20 10:36:29.563285 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:36:29.563251 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0\": container with ID starting with 0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0 not found: ID does not exist" containerID="0be625589a35e89f7fcf2f2cf7444fec61d92f717bfe03538db72c0f9e67f0d0" Apr 20 10:36:29.663558 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:36:29.663468 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381\": container with ID starting with 1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381 not found: ID does not exist" containerID="1af73e5379d49d1cca95130446dfd54aec3d2874f2dd8f96a51a4d0409d67381" Apr 20 10:36:30.159588 ip-10-0-137-106 kubenswrapper[2566]: E0420 10:36:30.159556 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c\": container with ID starting with b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c not found: ID does not exist" containerID="b5abbb51d09549173c757b29d84c7262cc09e17391182e63081c462dfc9b188c" Apr 20 10:36:34.407450 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.407418 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr"] Apr 20 10:36:34.411080 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.411054 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-nslk9/test-trainjob-k75fw-node-0-0-d7fsr"] Apr 20 10:36:34.608165 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.608131 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk"] Apr 20 10:36:34.614314 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.614281 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-ctwg5/test-trainjob-z456d-node-0-0-6hgjk"] Apr 20 10:36:34.775116 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.775082 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz"] Apr 20 10:36:34.776947 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.776918 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-6dr82/test-trainjob-h8448-node-0-0-wgskz"] Apr 20 10:36:34.859444 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.859411 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23744d4a-893f-4c73-8929-e7ea90bd2232" path="/var/lib/kubelet/pods/23744d4a-893f-4c73-8929-e7ea90bd2232/volumes" Apr 20 10:36:34.859738 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.859725 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a6f9ce-8501-411b-b66f-e98e3efae6a3" path="/var/lib/kubelet/pods/69a6f9ce-8501-411b-b66f-e98e3efae6a3/volumes" Apr 20 10:36:34.859989 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:34.859979 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdb93cf-a53a-4f2b-973c-65e91f623461" path="/var/lib/kubelet/pods/6bdb93cf-a53a-4f2b-973c-65e91f623461/volumes" Apr 20 10:36:35.376779 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:35.376742 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr"] Apr 20 10:36:35.381433 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:35.381397 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-9br85/test-trainjob-hs94f-node-0-0-b5cdr"] Apr 20 10:36:36.858587 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:36.858550 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1894698-cdbf-469d-9dcd-b5b67734bff9" path="/var/lib/kubelet/pods/c1894698-cdbf-469d-9dcd-b5b67734bff9/volumes" Apr 20 10:36:45.050668 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:45.050636 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-lw68j_ac73c45f-15b7-4308-a46e-c9d897d078d1/manager/0.log" Apr 20 10:36:45.470522 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:45.470441 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-lw68j_ac73c45f-15b7-4308-a46e-c9d897d078d1/manager/0.log" Apr 20 10:36:45.881670 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:36:45.881643 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-lw68j_ac73c45f-15b7-4308-a46e-c9d897d078d1/manager/0.log" Apr 20 10:37:20.954651 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.954561 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/must-gather-dg9nj"] Apr 20 10:37:20.955109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.954792 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23744d4a-893f-4c73-8929-e7ea90bd2232" containerName="node" Apr 20 10:37:20.955109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.954802 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="23744d4a-893f-4c73-8929-e7ea90bd2232" containerName="node" Apr 20 10:37:20.955109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.954868 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="23744d4a-893f-4c73-8929-e7ea90bd2232" containerName="node" Apr 20 10:37:20.957594 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.957574 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:20.960597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.960569 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kqj2z\"/\"kube-root-ca.crt\"" Apr 20 10:37:20.961062 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.961042 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kqj2z\"/\"default-dockercfg-zn9bs\"" Apr 20 10:37:20.961153 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.961048 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kqj2z\"/\"openshift-service-ca.crt\"" Apr 20 10:37:20.971848 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:20.971818 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/must-gather-dg9nj"] Apr 20 10:37:21.016229 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.016182 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5rbn\" (UniqueName: \"kubernetes.io/projected/6cb9b808-cb39-44f1-a268-58eb99d9fecc-kube-api-access-k5rbn\") pod \"must-gather-dg9nj\" (UID: \"6cb9b808-cb39-44f1-a268-58eb99d9fecc\") " pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.016437 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.016253 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cb9b808-cb39-44f1-a268-58eb99d9fecc-must-gather-output\") pod \"must-gather-dg9nj\" (UID: \"6cb9b808-cb39-44f1-a268-58eb99d9fecc\") " pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.116652 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.116618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cb9b808-cb39-44f1-a268-58eb99d9fecc-must-gather-output\") pod \"must-gather-dg9nj\" (UID: \"6cb9b808-cb39-44f1-a268-58eb99d9fecc\") " pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.116771 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.116668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5rbn\" (UniqueName: 
\"kubernetes.io/projected/6cb9b808-cb39-44f1-a268-58eb99d9fecc-kube-api-access-k5rbn\") pod \"must-gather-dg9nj\" (UID: \"6cb9b808-cb39-44f1-a268-58eb99d9fecc\") " pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.116993 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.116972 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cb9b808-cb39-44f1-a268-58eb99d9fecc-must-gather-output\") pod \"must-gather-dg9nj\" (UID: \"6cb9b808-cb39-44f1-a268-58eb99d9fecc\") " pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.125655 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.125620 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5rbn\" (UniqueName: \"kubernetes.io/projected/6cb9b808-cb39-44f1-a268-58eb99d9fecc-kube-api-access-k5rbn\") pod \"must-gather-dg9nj\" (UID: \"6cb9b808-cb39-44f1-a268-58eb99d9fecc\") " pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.267324 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.267293 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/must-gather-dg9nj" Apr 20 10:37:21.392621 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.392592 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/must-gather-dg9nj"] Apr 20 10:37:21.394649 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:37:21.394617 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cb9b808_cb39_44f1_a268_58eb99d9fecc.slice/crio-0b1b6ade4deb26625e226c9e3323c780b583a3903a5289aad89ef579382c6a59 WatchSource:0}: Error finding container 0b1b6ade4deb26625e226c9e3323c780b583a3903a5289aad89ef579382c6a59: Status 404 returned error can't find the container with id 0b1b6ade4deb26625e226c9e3323c780b583a3903a5289aad89ef579382c6a59 Apr 20 10:37:21.396462 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:21.396438 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:37:22.228163 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:22.228123 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/must-gather-dg9nj" event={"ID":"6cb9b808-cb39-44f1-a268-58eb99d9fecc","Type":"ContainerStarted","Data":"0b1b6ade4deb26625e226c9e3323c780b583a3903a5289aad89ef579382c6a59"} Apr 20 10:37:23.232839 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:23.232805 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/must-gather-dg9nj" event={"ID":"6cb9b808-cb39-44f1-a268-58eb99d9fecc","Type":"ContainerStarted","Data":"fd3f17dea75e6204ba9b0741086593b165b3fb5a30a9b69d48a80737b8ed320f"} Apr 20 10:37:23.232839 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:23.232844 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/must-gather-dg9nj" event={"ID":"6cb9b808-cb39-44f1-a268-58eb99d9fecc","Type":"ContainerStarted","Data":"4a31b834dbcec00c854471e874e8f2d45ac29ee8b17294b8dbd72dae08d684f3"} Apr 20 10:37:23.257533 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:23.257426 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kqj2z/must-gather-dg9nj" podStartSLOduration=2.399745596 podStartE2EDuration="3.257406513s" podCreationTimestamp="2026-04-20 10:37:20 +0000 UTC" firstStartedPulling="2026-04-20 
10:37:21.39657894 +0000 UTC m=+2169.056241599" lastFinishedPulling="2026-04-20 10:37:22.254239853 +0000 UTC m=+2169.913902516" observedRunningTime="2026-04-20 10:37:23.252656087 +0000 UTC m=+2170.912318767" watchObservedRunningTime="2026-04-20 10:37:23.257406513 +0000 UTC m=+2170.917069197" Apr 20 10:37:24.162324 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:24.162288 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-94htn_0b10ec74-afc5-4519-a053-44766e5a7624/global-pull-secret-syncer/0.log" Apr 20 10:37:24.299053 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:24.299024 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6bwlp_b5da0aba-0c60-4b76-b8af-041b04e6fc2b/konnectivity-agent/0.log" Apr 20 10:37:24.400836 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:24.400795 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-106.ec2.internal_daadd8592f265af4bb30938041dae753/haproxy/0.log" Apr 20 10:37:27.678120 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:27.678091 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gq8j4_c59a8f97-78e5-4ba1-9567-dad2612f9fee/node-exporter/0.log" Apr 20 10:37:27.702203 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:27.702176 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gq8j4_c59a8f97-78e5-4ba1-9567-dad2612f9fee/kube-rbac-proxy/0.log" Apr 20 10:37:27.726995 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:27.726970 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gq8j4_c59a8f97-78e5-4ba1-9567-dad2612f9fee/init-textfile/0.log" Apr 20 10:37:30.813710 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.813644 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x"] Apr 20 10:37:30.814175 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.813948 2566 kubelet.go:2420] "Pod admission denied" podUID="1615d10c-21f4-4116-9a1d-3ab5a7155e2b" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:37:30.859105 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.859076 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x"] Apr 20 10:37:30.859262 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.859183 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" Apr 20 10:37:30.931537 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.931496 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x"] Apr 20 10:37:30.937456 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.937427 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x"] Apr 20 10:37:30.990134 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.990095 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2"] Apr 20 10:37:30.990299 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:30.990286 2566 kubelet.go:2420] "Pod admission denied" podUID="be876a45-63d8-4a15-b15e-9e0b3dd52072" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:37:31.012741 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.012702 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2"] Apr 20 10:37:31.012923 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.012841 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" Apr 20 10:37:31.258673 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.258644 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" Apr 20 10:37:31.258853 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.258644 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" Apr 20 10:37:31.262975 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.262949 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" Apr 20 10:37:31.266188 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.266166 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" Apr 20 10:37:31.914068 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.914030 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2"] Apr 20 10:37:31.920627 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.920597 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2"] Apr 20 10:37:31.961433 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.961397 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk"] Apr 20 10:37:31.961916 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.961881 2566 kubelet.go:2420] "Pod admission denied" podUID="ba202777-a1c8-4c89-ad98-25b40e1b0078" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk" reason="Evicted" message="The node had condition: [DiskPressure]. 
" Apr 20 10:37:31.974535 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.974495 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk"] Apr 20 10:37:31.974718 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:31.974628 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk" Apr 20 10:37:32.264530 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.264500 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk" Apr 20 10:37:32.264720 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.264501 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" Apr 20 10:37:32.264720 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.264516 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" Apr 20 10:37:32.267693 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.267663 2566 status_manager.go:895] "Failed to get status for pod" podUID="1615d10c-21f4-4116-9a1d-3ab5a7155e2b" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" err="pods \"perf-node-gather-daemonset-k588x\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:32.270561 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.270539 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk" Apr 20 10:37:32.271991 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.271962 2566 status_manager.go:895] "Failed to get status for pod" podUID="1615d10c-21f4-4116-9a1d-3ab5a7155e2b" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" err="pods \"perf-node-gather-daemonset-k588x\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:32.274093 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.274060 2566 status_manager.go:895] "Failed to get status for pod" podUID="be876a45-63d8-4a15-b15e-9e0b3dd52072" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" err="pods \"perf-node-gather-daemonset-xsqj2\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:32.276121 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.276091 2566 status_manager.go:895] "Failed to get status for pod" podUID="be876a45-63d8-4a15-b15e-9e0b3dd52072" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" err="pods \"perf-node-gather-daemonset-xsqj2\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:32.278012 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.277982 2566 status_manager.go:895] "Failed to get status for pod" podUID="1615d10c-21f4-4116-9a1d-3ab5a7155e2b" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" err="pods \"perf-node-gather-daemonset-k588x\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:32.566821 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.566741 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qhjjc_d9c07ac7-fc07-4127-8ab6-18d69cec95c8/dns/0.log" Apr 20 10:37:32.626697 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.626667 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qhjjc_d9c07ac7-fc07-4127-8ab6-18d69cec95c8/kube-rbac-proxy/0.log" Apr 20 10:37:32.714767 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.714732 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tjdkv_1bd60891-3cb2-4033-81b3-819a1fd45edd/dns-node-resolver/0.log" Apr 20 10:37:32.858572 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.858466 2566 status_manager.go:895] "Failed to get status for pod" podUID="1615d10c-21f4-4116-9a1d-3ab5a7155e2b" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-k588x" err="pods \"perf-node-gather-daemonset-k588x\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:32.860075 
ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:32.860051 2566 status_manager.go:895] "Failed to get status for pod" podUID="be876a45-63d8-4a15-b15e-9e0b3dd52072" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-xsqj2" err="pods \"perf-node-gather-daemonset-xsqj2\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:33.267121 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:33.267089 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk" Apr 20 10:37:33.320526 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:33.320496 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jtlqx_cd566bd9-42a8-48e9-889b-d01eee4488c2/node-ca/0.log" Apr 20 10:37:34.058196 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.058160 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk"] Apr 20 10:37:34.063037 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.063007 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ptnxk"] Apr 20 10:37:34.112743 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.112708 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c"] Apr 20 10:37:34.112922 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.112906 2566 kubelet.go:2420] "Pod admission denied" podUID="fe2e0046-8e47-479c-a709-3eaf83ec5e75" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:37:34.118015 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.117982 2566 status_manager.go:895] "Failed to get status for pod" podUID="fe2e0046-8e47-479c-a709-3eaf83ec5e75" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c" err="pods \"perf-node-gather-daemonset-ncf7c\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kqj2z\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 20 10:37:34.137060 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.137024 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c"] Apr 20 10:37:34.137236 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.137161 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c" Apr 20 10:37:34.270613 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.270580 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c" Apr 20 10:37:34.277072 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.277047 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c" Apr 20 10:37:34.770985 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:34.770958 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qxrrr_55c7ba10-8caa-41b9-bbae-94fed9831f88/serve-healthcheck-canary/0.log" Apr 20 10:37:35.273577 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:35.273545 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c" Apr 20 10:37:35.378789 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:35.378759 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b5fs4_487f661d-51d8-4fe5-bd4d-40b9f9a67f05/kube-rbac-proxy/0.log" Apr 20 10:37:35.411541 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:35.411483 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b5fs4_487f661d-51d8-4fe5-bd4d-40b9f9a67f05/exporter/0.log" Apr 20 10:37:35.452157 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:35.452129 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b5fs4_487f661d-51d8-4fe5-bd4d-40b9f9a67f05/extractor/0.log" Apr 20 10:37:38.148799 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.148769 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c"] Apr 20 10:37:38.151654 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.151626 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-ncf7c"] Apr 20 10:37:38.182020 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.181984 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9"] Apr 20 10:37:38.182222 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.182201 2566 kubelet.go:2420] "Pod admission denied" podUID="30277d21-b959-4d33-8507-77f818a9ced8" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:37:38.197716 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.197679 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9"] Apr 20 10:37:38.197896 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.197808 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9" Apr 20 10:37:38.281078 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.281049 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9" Apr 20 10:37:38.286109 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:38.286084 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9" Apr 20 10:37:39.283277 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:39.283242 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9" Apr 20 10:37:42.202088 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.202060 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/kube-multus-additional-cni-plugins/0.log" Apr 20 10:37:42.227411 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.227372 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/egress-router-binary-copy/0.log" Apr 20 10:37:42.250469 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.250437 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/cni-plugins/0.log" Apr 20 10:37:42.273005 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.272976 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/bond-cni-plugin/0.log" Apr 20 10:37:42.294819 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.294789 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/routeoverride-cni/0.log" Apr 20 10:37:42.317409 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.317380 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/whereabouts-cni-bincopy/0.log" Apr 20 10:37:42.339728 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.339694 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9z29g_ed2f027e-e531-41a7-8185-1a14d9f86cb2/whereabouts-cni/0.log" Apr 20 10:37:42.598657 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.598621 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-smhx6_3e96c2c0-000c-46ed-b65a-85a4c7b0ea18/kube-multus/0.log" Apr 20 10:37:42.694212 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.694184 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dhkq5_9b07cfc0-68ca-4db2-bd1d-22319ff081b1/network-metrics-daemon/0.log" Apr 20 10:37:42.714298 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:42.714271 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dhkq5_9b07cfc0-68ca-4db2-bd1d-22319ff081b1/kube-rbac-proxy/0.log" Apr 20 10:37:43.526749 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.526717 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-controller/0.log" Apr 20 10:37:43.554901 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.554871 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:37:43.565737 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.565707 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/1.log" Apr 20 10:37:43.591201 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.591170 2566 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/kube-rbac-proxy-node/0.log" Apr 20 10:37:43.618904 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.618876 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 10:37:43.658070 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.658042 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/northd/0.log" Apr 20 10:37:43.743542 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.743519 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/nbdb/0.log" Apr 20 10:37:43.771818 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.771794 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/sbdb/0.log" Apr 20 10:37:43.893180 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:43.893102 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovnkube-controller/0.log" Apr 20 10:37:45.652390 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:45.652335 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kd7l2_59a2e033-9cb8-4b1c-adf3-c0a5307d7e50/network-check-target-container/0.log" Apr 20 10:37:46.586335 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:46.586305 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wqpqq_afb0fd43-aa3d-4c27-aea8-0ac8c7dbd6bf/iptables-alerter/0.log" Apr 20 10:37:47.325420 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.325383 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9"] Apr 20 10:37:47.326615 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.326584 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nhf2t_eb2dd1bf-0a1d-42f6-8271-5ccafdf2c9ad/tuned/0.log" Apr 20 10:37:47.332844 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.332812 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-m4rb9"] Apr 20 10:37:47.366545 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.366510 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj"] Apr 20 10:37:47.366709 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.366698 2566 kubelet.go:2420] "Pod admission denied" podUID="94edaed2-8ec3-414a-9016-e6b4756709ed" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:37:47.378630 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.378595 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj"] Apr 20 10:37:47.378791 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:47.378726 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj" Apr 20 10:37:48.309950 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:48.309919 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj" Apr 20 10:37:48.315943 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:48.315910 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj" Apr 20 10:37:49.312984 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:49.312951 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj" Apr 20 10:37:51.528004 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:51.527975 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-76tbd_3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7/csi-driver/0.log" Apr 20 10:37:51.580279 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:51.580251 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-76tbd_3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7/csi-node-driver-registrar/0.log" Apr 20 10:37:51.669670 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:37:51.669640 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-76tbd_3b29027d-94f9-4f50-ad5f-5c3c34aa1bf7/csi-liveness-probe/0.log" Apr 20 10:38:03.409965 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:03.409928 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj"] Apr 20 10:38:03.424894 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:03.424859 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-vvfdj"] Apr 20 10:38:03.468463 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:03.468430 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq"] Apr 20 10:38:03.468643 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:03.468626 2566 kubelet.go:2420] "Pod admission denied" podUID="df474128-a182-409c-8224-132e60312161" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:38:03.488468 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:03.488431 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq"] Apr 20 10:38:03.488652 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:03.488543 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq" Apr 20 10:38:04.351121 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:04.351089 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq" Apr 20 10:38:04.355391 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:04.355364 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq" Apr 20 10:38:05.353064 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:05.353031 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq" Apr 20 10:38:35.529752 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:35.529661 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq"] Apr 20 10:38:35.533910 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:35.533873 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-427dq"] Apr 20 10:38:35.558001 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:35.557963 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94"] Apr 20 10:38:35.558195 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:35.558157 2566 kubelet.go:2420] "Pod admission denied" podUID="13ecd411-5419-4793-840a-25fc6cebb8db" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:38:35.567845 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:35.567813 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94"] Apr 20 10:38:35.568024 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:35.567946 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94" Apr 20 10:38:36.436863 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:36.436829 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94" Apr 20 10:38:36.441219 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:36.441197 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94" Apr 20 10:38:37.438825 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:38:37.438797 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94" Apr 20 10:39:39.588926 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:39.588886 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94"] Apr 20 10:39:39.592322 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:39.592293 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-4lk94"] Apr 20 10:39:39.615886 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:39.615855 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5"] Apr 20 10:39:39.616063 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:39.616046 2566 kubelet.go:2420] "Pod admission denied" podUID="9dd78a36-1182-44f9-b507-73754dd42138" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 20 10:39:39.626059 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:39.626020 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5"] Apr 20 10:39:39.626207 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:39.626148 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5" Apr 20 10:39:40.617897 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:40.617868 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5" Apr 20 10:39:40.622409 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:40.622385 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5" Apr 20 10:39:41.620599 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:39:41.620565 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5" Apr 20 10:40:28.727222 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:40:28.727132 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:41:12.918833 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:12.918705 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:41:12.926660 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:12.926638 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-69b6q_580807b1-acaf-4082-b5c8-ab84f495b516/ovn-acl-logging/0.log" Apr 20 10:41:47.648901 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.648815 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5"] Apr 20 10:41:47.651898 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.651867 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-pp2z5"] Apr 20 10:41:47.678022 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.677990 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9"] Apr 20 10:41:47.681074 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.681056 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.696956 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.696924 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9"] Apr 20 10:41:47.795983 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.795944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-lib-modules\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.796161 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.795991 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-proc\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.796161 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.796065 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-podres\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.796161 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.796086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-sys\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.796161 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.796127 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchsr\" (UniqueName: \"kubernetes.io/projected/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-kube-api-access-wchsr\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897419 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-proc\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-podres\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-sys\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897490 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wchsr\" (UniqueName: \"kubernetes.io/projected/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-kube-api-access-wchsr\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897523 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-proc\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-podres\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897570 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-sys\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897597 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897592 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-lib-modules\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.897886 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.897686 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-lib-modules\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.906167 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.906100 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchsr\" (UniqueName: \"kubernetes.io/projected/9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6-kube-api-access-wchsr\") pod \"perf-node-gather-daemonset-f82w9\" (UID: \"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:47.990754 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:47.990713 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:48.112665 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:48.112630 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9"] Apr 20 10:41:48.115868 ip-10-0-137-106 kubenswrapper[2566]: W0420 10:41:48.115833 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9eadc3b1_fb8a_49f1_ab55_2c74a984f1c6.slice/crio-b5146c396f32d204e9abdf983e3d81ce3d35e92bd46bcb9d11eb5fe300cffb5b WatchSource:0}: Error finding container b5146c396f32d204e9abdf983e3d81ce3d35e92bd46bcb9d11eb5fe300cffb5b: Status 404 returned error can't find the container with id b5146c396f32d204e9abdf983e3d81ce3d35e92bd46bcb9d11eb5fe300cffb5b Apr 20 10:41:48.963231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:48.963192 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" event={"ID":"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6","Type":"ContainerStarted","Data":"00b87763095215064f5448f75288dc333be2a6a1837f3ad8fa4d7549e1555c9a"} Apr 20 10:41:48.963231 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:48.963231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" event={"ID":"9eadc3b1-fb8a-49f1-ab55-2c74a984f1c6","Type":"ContainerStarted","Data":"b5146c396f32d204e9abdf983e3d81ce3d35e92bd46bcb9d11eb5fe300cffb5b"} Apr 20 10:41:48.963796 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:48.963305 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" Apr 20 10:41:48.979090 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:48.979028 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9" podStartSLOduration=1.979014466 podStartE2EDuration="1.979014466s" podCreationTimestamp="2026-04-20 10:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:41:48.978888457 +0000 UTC m=+2436.638551139" watchObservedRunningTime="2026-04-20 10:41:48.979014466 +0000 UTC m=+2436.638677146" Apr 20 10:41:54.977368 ip-10-0-137-106 kubenswrapper[2566]: I0420 10:41:54.977319 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-f82w9"