Apr 17 11:13:31.292167 ip-10-0-135-188 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:31.292183 ip-10-0-135-188 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:31.292191 ip-10-0-135-188 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:31.292480 ip-10-0-135-188 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:13:41.498677 ip-10-0-135-188 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:13:41.498693 ip-10-0-135-188 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e6ff160bc63a426f968f0ce1c9f28d28 --
Apr 17 11:16:16.121473 ip-10-0-135-188 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:16.522091 ip-10-0-135-188 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:16.522091 ip-10-0-135-188 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:16.522091 ip-10-0-135-188 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:16.522091 ip-10-0-135-188 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:16.522091 ip-10-0-135-188 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:16.524724 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.524630 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:16.530347 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530322 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:16.530347 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530343 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:16.530347 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530346 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:16.530347 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530351 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:16.530347 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530354 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530357 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530360 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530363 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530367 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530369 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530372 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530375 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530377 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530380 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530383 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530385 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530388 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530391 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530393 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530396 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530399 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530401 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530404 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:16.530547 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530406 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530414 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530419 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530422 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530425 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530428 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530430 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530433 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530436 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530439 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530441 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530444 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530446 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530449 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530451 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530455 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530459 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530462 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530464 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530467 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:16.530998 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530470 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530473 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530477 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530479 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530482 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530485 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530488 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530491 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530493 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530496 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530498 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530501 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530504 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530506 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530509 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530512 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530515 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530517 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530520 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530523 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:16.531534 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530525 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530528 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530530 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530533 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530536 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530538 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530540 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530544 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530547 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530550 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530553 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530556 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530558 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530561 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530566 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530570 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530573 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530576 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530579 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:16.532012 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530582 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:16.532504 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530585 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:16.532504 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530588 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:16.532504 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.530590 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:16.532910 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532898 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532911 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532916 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532919 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532922 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532926 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532929 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532932 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532935 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532938 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532941 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:16.532939 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532944 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532947 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532950 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532953 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532955 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532958 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532960 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532963 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532965 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532968 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532970 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532972 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532975 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532978 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532981 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532983 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532986 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532988 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532991 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532993 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:16.533223 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532995 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.532998 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533001 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533003 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533007 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533009 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533013 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533016 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533019 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533022 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533024 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533027 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533030 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533032 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533036 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533038 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533041 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533044 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533046 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:16.533724 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533049 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533051 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533054 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533056 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533059 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533061 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533063 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533067 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533069 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533072 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533074 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533077 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533079 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533082 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533084 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533087 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533090 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533092 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533095 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533097 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:16.534214 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533100 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533102 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533105 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533108 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533110 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533113 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533115 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533118 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533121 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533123 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533126 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533128 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533145 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533148 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533150 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533153 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533228 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533236 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533243 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533248 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533254 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:16.534736 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533257 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533261 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533266 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533269 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533272 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533276 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533280 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533283 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533286 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533289 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533292 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533295 2580 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533298 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533301 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533305 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533308 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533311 2580 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533314 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533322 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533326 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533329 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533333 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533336 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533339 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:16.535288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533342 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533345 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533349 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533351 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533356 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533359 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533361 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533364 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533368 2580 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533371 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533379 2580 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533382 2580 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533385 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533388 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533391 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533395 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533398 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533401 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533404 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533407 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533409 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533412 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533415 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533418 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533421 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:16:16.535860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533424 2580 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533429 2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533432 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533436 2580 
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533439 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533442 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533445 2580 flags.go:64] FLAG: --help="false" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533448 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533451 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533454 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533458 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533461 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533465 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533468 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533470 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533473 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533476 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:16.536481 ip-10-0-135-188 
kubenswrapper[2580]: I0417 11:16:16.533479 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533482 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533485 2580 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533488 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533491 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533494 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533497 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:16.536481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533499 2580 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533502 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533505 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533508 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533513 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533516 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533519 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533521 2580 flags.go:64] FLAG: 
--logging-format="text" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533524 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533529 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533532 2580 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533535 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533539 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533542 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533547 2580 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533549 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533552 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533555 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533558 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533561 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533564 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533567 2580 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533575 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533578 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533581 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:16.537056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533584 2580 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533587 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533593 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533596 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533599 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533602 2580 flags.go:64] FLAG: --port="10250" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533605 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533608 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0befdedf46d2c0c82" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533611 2580 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533614 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533617 
2580 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533620 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533623 2580 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533627 2580 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533630 2580 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533633 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533637 2580 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533641 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533644 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533647 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533650 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533653 2580 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533656 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533659 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533662 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:16.537688 ip-10-0-135-188 kubenswrapper[2580]: I0417 
11:16:16.533665 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533668 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533671 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533674 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533677 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533679 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533682 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533685 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533688 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533692 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533694 2580 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533697 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533703 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533705 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533708 2580 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533712 2580 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533715 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533718 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533721 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533724 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533727 2580 flags.go:64] FLAG: --v="2" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533731 2580 flags.go:64] FLAG: --version="false" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533736 2580 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533741 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.533745 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:16.538308 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533847 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533850 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533854 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533856 2580 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533859 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533862 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533864 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533867 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533869 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533872 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533874 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533877 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533879 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533882 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533885 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533887 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533890 
2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533892 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533895 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533897 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:16.538912 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533900 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533902 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533905 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533907 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533910 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533912 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533915 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533917 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533920 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 
11:16:16.533922 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533925 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533928 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533931 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533934 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533936 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533939 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533943 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533947 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533950 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:16.539440 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533953 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533955 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533958 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533961 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533963 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533966 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533969 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533972 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533974 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533976 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:16.539950 ip-10-0-135-188 
kubenswrapper[2580]: W0417 11:16:16.533980 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533982 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533985 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533987 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533989 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533992 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533994 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.533997 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534000 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534002 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:16.539950 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534004 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534007 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534010 2580 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534014 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534016 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534019 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534021 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534024 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534027 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534029 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534032 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534034 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534037 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534039 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534042 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534044 2580 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534047 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534049 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534052 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534054 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:16.540495 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534058 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534061 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534064 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534067 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534069 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534072 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.534074 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:16.541024 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.534082 2580 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:16.542048 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.542028 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:16.542082 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.542050 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:16.542118 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542101 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:16.542118 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542107 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:16.542118 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542111 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:16.542118 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542115 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:16.542118 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542118 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:16.542118 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542121 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542124 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542127 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542147 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542152 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542156 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542160 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542164 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542167 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542169 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542172 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542175 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542178 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542180 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542184 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542187 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542190 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542192 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542195 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542197 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:16.542290 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542200 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542203 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542205 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542208 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542211 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542214 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542217 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542220 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542223 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542225 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542229 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542231 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542234 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542236 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542239 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542242 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542245 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542248 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542251 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542254 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:16.542786 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542256 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542260 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542265 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542267 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542270 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542273 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542276 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542279 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542281 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542284 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542287 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542289 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542292 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542295 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542297 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542300 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542303 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542306 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542308 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:16.543369 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542311 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542314 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542316 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542319 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542321 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542323 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542326 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542330 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542332 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542335 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542337 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542340 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542343 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542345 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542348 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542351 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542354 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542357 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542359 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:16.543828 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542363 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542366 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542369 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.542375 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542476 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542481 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542484 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542487 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542490 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542493 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542496 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542499 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542502 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542504 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542507 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:16.544325 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542510 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542513 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542515 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542518 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542521 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542524 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542527 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542530 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542532 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542535 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542538 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542540 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542542 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542545 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542547 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542550 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542553 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542555 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542558 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542560 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:16.544705 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542562 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542565 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542567 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542570 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542573 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542577 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542580 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542583 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542586 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542589 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542591 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542595 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542599 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542602 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542604 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542607 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542610 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542612 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542615 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:16.545212 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542618 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542621 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542623 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542626 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542628 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542631 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542633 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542636 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542638 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542640 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542643 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542646 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542648 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542650 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542653 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542656 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542658 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542660 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542663 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542666 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:16.545665 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542668 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542671 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542673 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542675 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542678 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542680 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542683 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542686 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542688 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542691 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542694 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542696 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542699 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542701 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542704 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:16.542707 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:16.546164 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.542711 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:16.546570 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.543504 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:16.546570 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.546346 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:16.547236 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.547220 2580 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:16.547345 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.547327 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:16.547398 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.547372 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:16.568924 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.568899 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:16.573561 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.573543 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:16.586268 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.586242 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:16.592190 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.592171 2580 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:16.594267 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.594248 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:16.597417 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.597399 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:16.598521 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.598502 2580 fs.go:135] Filesystem UUIDs: map[4c951819-f0d3-4aee-a036-53106aec7dc0:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a551e536-8b27-4a64-a64f-d73bebcb764f:/dev/nvme0n1p3]
Apr 17 11:16:16.598579 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.598522 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:16.604234 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.604107 2580 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:16.602384389 +0000 UTC m=+0.372259538 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099887 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23ef164756136ef4c68d0ee2be7dd9 SystemUUID:ec23ef16-4756-136e-f4c6-8d0ee2be7dd9 BootID:e6ff160b-c63a-426f-968f-0ce1c9f28d28 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bc:ad:7e:75:b1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bc:ad:7e:75:b1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:62:0b:cb:03:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:16.604234 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.604227 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:16.604370 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.604356 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 11:16:16.605541 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.605516 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 11:16:16.605687 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.605545 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-188.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 11:16:16.605733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.605696 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 11:16:16.605733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.605706 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 11:16:16.605733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.605723 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:16.606480 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.606470 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:16.607635 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.607626 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:16.607739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.607731 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 11:16:16.609918 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.609907 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 17 11:16:16.609957 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.609927 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 11:16:16.609957 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.609941 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 11:16:16.609957 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.609950 2580 kubelet.go:397] "Adding apiserver pod source" Apr 17 11:16:16.610038 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.609962 2580 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 11:16:16.611118 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.611103 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:16.611168 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.611153 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:16.613916 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.613898 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:16:16.616003 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.615987 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:16.617947 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.617929 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:16.618005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.617962 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:16.618005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.617976 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:16.618005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.617987 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:16.618005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.617999 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618012 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618025 2580 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618037 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618051 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618064 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618094 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:16:16.618120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.618114 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:16.619054 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.619042 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:16.619054 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.619052 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:16.621664 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.621627 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:16.621963 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.621913 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-188.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:16.622917 ip-10-0-135-188 kubenswrapper[2580]: 
I0417 11:16:16.622904 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:16.622976 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.622943 2580 server.go:1295] "Started kubelet" Apr 17 11:16:16.623040 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.623009 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 11:16:16.623127 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.623063 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:16.623185 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.623156 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:16.623753 ip-10-0-135-188 systemd[1]: Started Kubernetes Kubelet. Apr 17 11:16:16.626501 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.626477 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:16.629090 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.629068 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:16.630000 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.629983 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gqktf" Apr 17 11:16:16.632097 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.632075 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-188.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:16.632984 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.632107 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-135-188.ec2.internal.18a720bd93e1f9c8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-188.ec2.internal,UID:ip-10-0-135-188.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-188.ec2.internal,},FirstTimestamp:2026-04-17 11:16:16.622918088 +0000 UTC m=+0.392793239,LastTimestamp:2026-04-17 11:16:16.622918088 +0000 UTC m=+0.392793239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-188.ec2.internal,}" Apr 17 11:16:16.635509 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.635428 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:16.635509 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.635486 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:16.636251 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636231 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:16.636554 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.636514 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:16.636641 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636551 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:16.636691 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636660 2580 factory.go:55] Registering systemd factory Apr 17 11:16:16.636691 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636670 2580 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:16.636781 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636750 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:16.636781 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636766 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:16.636781 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.636770 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found" Apr 17 11:16:16.636910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636847 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:16.636910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636855 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:16.636981 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636917 2580 factory.go:153] Registering CRI-O factory Apr 17 11:16:16.636981 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636932 2580 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:16.636981 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.636954 2580 factory.go:103] Registering Raw factory Apr 17 11:16:16.636981 ip-10-0-135-188 
kubenswrapper[2580]: I0417 11:16:16.636967 2580 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:16.637419 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.637406 2580 manager.go:319] Starting recovery of all containers Apr 17 11:16:16.638402 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.638376 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gqktf" Apr 17 11:16:16.640503 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.640479 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-188.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 11:16:16.640590 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.640564 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 11:16:16.650197 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.650179 2580 manager.go:324] Recovery completed Apr 17 11:16:16.654303 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.654287 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:16.657006 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.656989 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:16.657079 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.657019 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 
11:16:16.657079 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.657029 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:16.657600 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.657587 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:16.657667 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.657600 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:16.657667 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.657618 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:16.660399 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.660387 2580 policy_none.go:49] "None policy: Start" Apr 17 11:16:16.660435 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.660405 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:16.660435 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.660417 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:16.700777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.700762 2580 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.700794 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.700805 2580 server.go:85] "Starting device plugin registration server" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.701060 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.701072 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.701173 2580 
plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.701256 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.701265 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.701906 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 11:16:16.714391 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.701938 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-188.ec2.internal\" not found" Apr 17 11:16:16.777286 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.777209 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:16.778572 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.778556 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 11:16:16.778627 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.778585 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:16.778627 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.778618 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 11:16:16.778711 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.778628 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:16.778756 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.778733 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:16.781536 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.781515 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:16.801874 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.801843 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:16.802688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.802672 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:16.802766 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.802705 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:16.802766 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.802718 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:16.802766 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.802742 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.811948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.811930 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.812002 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.811952 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-188.ec2.internal\": node \"ip-10-0-135-188.ec2.internal\" not found" Apr 17 
11:16:16.825759 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.825737 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found" Apr 17 11:16:16.878841 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.878809 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal"] Apr 17 11:16:16.878991 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.878886 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:16.879844 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.879825 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:16.879933 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.879859 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:16.879933 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.879870 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:16.881090 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881079 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:16.881244 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881230 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.881303 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881258 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:16.881857 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881840 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:16.881931 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881871 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:16.881931 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881881 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:16.881931 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881848 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:16.881931 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881925 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:16.882086 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.881939 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:16.883074 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.883056 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.883173 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.883080 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:16.883730 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.883712 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:16.883813 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.883744 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:16.883813 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.883760 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:16.907794 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.907771 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-188.ec2.internal\" not found" node="ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.912127 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.912112 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-188.ec2.internal\" not found" node="ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.926776 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:16.926739 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found" Apr 17 11:16:16.938926 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.938904 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ac675744bd931602b0121ca520dff9a8-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal\" (UID: \"ac675744bd931602b0121ca520dff9a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.938999 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.938930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac675744bd931602b0121ca520dff9a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal\" (UID: \"ac675744bd931602b0121ca520dff9a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" Apr 17 11:16:16.938999 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:16.938949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0fb21237899613074192be5ee06d9825-config\") pod \"kube-apiserver-proxy-ip-10-0-135-188.ec2.internal\" (UID: \"0fb21237899613074192be5ee06d9825\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal" Apr 17 11:16:17.026844 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.026807 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found" Apr 17 11:16:17.039528 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.039472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ac675744bd931602b0121ca520dff9a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal\" (UID: \"ac675744bd931602b0121ca520dff9a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" Apr 17 11:16:17.039528 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.039503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ac675744bd931602b0121ca520dff9a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal\" (UID: \"ac675744bd931602b0121ca520dff9a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.039528 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.039521 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0fb21237899613074192be5ee06d9825-config\") pod \"kube-apiserver-proxy-ip-10-0-135-188.ec2.internal\" (UID: \"0fb21237899613074192be5ee06d9825\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.039663 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.039559 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0fb21237899613074192be5ee06d9825-config\") pod \"kube-apiserver-proxy-ip-10-0-135-188.ec2.internal\" (UID: \"0fb21237899613074192be5ee06d9825\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.039663 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.039564 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac675744bd931602b0121ca520dff9a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal\" (UID: \"ac675744bd931602b0121ca520dff9a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.039663 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.039563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ac675744bd931602b0121ca520dff9a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal\" (UID: \"ac675744bd931602b0121ca520dff9a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.127591 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.127557 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.210091 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.210057 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.215030 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.215009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:17.228121 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.228092 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.328741 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.328638 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.429225 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.429181 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.498198 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.497440 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:17.529340 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.529309 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.546805 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.546777 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:16:17.546934 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.546915 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:17.546996 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.546964 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:17.630397 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.630365 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.636348 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.636321 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 11:16:17.640524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.640466 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:16 +0000 UTC" deadline="2027-10-21 10:31:44.842617268 +0000 UTC"
Apr 17 11:16:17.640524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.640523 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13247h15m27.202098806s"
Apr 17 11:16:17.655092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.655071 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:17.675456 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.675430 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-k6p9w"
Apr 17 11:16:17.683287 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.683267 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-k6p9w"
Apr 17 11:16:17.687499 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:17.687471 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac675744bd931602b0121ca520dff9a8.slice/crio-07766f779fc7f21752339905b47809a2d02408532504774507d9d7a9f0e0212a WatchSource:0}: Error finding container 07766f779fc7f21752339905b47809a2d02408532504774507d9d7a9f0e0212a: Status 404 returned error can't find the container with id 07766f779fc7f21752339905b47809a2d02408532504774507d9d7a9f0e0212a
Apr 17 11:16:17.687881 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:17.687861 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb21237899613074192be5ee06d9825.slice/crio-18c3e06f84f9e906aebf2f43450101d4e9a5db24b7a9a7324a398754a88ee0a7 WatchSource:0}: Error finding container 18c3e06f84f9e906aebf2f43450101d4e9a5db24b7a9a7324a398754a88ee0a7: Status 404 returned error can't find the container with id 18c3e06f84f9e906aebf2f43450101d4e9a5db24b7a9a7324a398754a88ee0a7
Apr 17 11:16:17.691535 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.691519 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:16:17.730926 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.730885 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.782404 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.782341 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" event={"ID":"ac675744bd931602b0121ca520dff9a8","Type":"ContainerStarted","Data":"07766f779fc7f21752339905b47809a2d02408532504774507d9d7a9f0e0212a"}
Apr 17 11:16:17.783281 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:17.783257 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal" event={"ID":"0fb21237899613074192be5ee06d9825","Type":"ContainerStarted","Data":"18c3e06f84f9e906aebf2f43450101d4e9a5db24b7a9a7324a398754a88ee0a7"}
Apr 17 11:16:17.831467 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.831423 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:17.931969 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:17.931886 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-188.ec2.internal\" not found"
Apr 17 11:16:18.002286 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.002258 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:18.036481 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.036451 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:18.048837 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.048795 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:18.049797 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.049775 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal"
Apr 17 11:16:18.058652 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.058619 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:18.196110 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.196036 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:18.612252 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.611950 2580 apiserver.go:52] "Watching apiserver"
Apr 17 11:16:18.621387 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.621362 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 11:16:18.623274 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.623242 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal","openshift-multus/multus-kxvmt","openshift-network-operator/iptables-alerter-wjx5n","openshift-ovn-kubernetes/ovnkube-node-nt59h","kube-system/konnectivity-agent-vjnsr","kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal","openshift-cluster-node-tuning-operator/tuned-cd864","openshift-dns/node-resolver-kgg54","openshift-multus/multus-additional-cni-plugins-fdfx7","openshift-multus/network-metrics-daemon-s9wws","openshift-network-diagnostics/network-check-target-b97qz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq","openshift-image-registry/node-ca-bfkcr"]
Apr 17 11:16:18.626524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.626490 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.626524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.626519 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wjx5n"
Apr 17 11:16:18.627779 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.627754 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.628887 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.628866 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vjnsr"
Apr 17 11:16:18.629331 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.629308 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.629439 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.629416 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.629503 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.629459 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.629576 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.629558 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mbtv5\""
Apr 17 11:16:18.629633 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.629608 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 11:16:18.629734 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.629704 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.630182 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.630163 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kgg54"
Apr 17 11:16:18.630278 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.630214 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2flrb\""
Apr 17 11:16:18.632037 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.632008 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 11:16:18.632505 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.632476 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.632634 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.632474 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 11:16:18.633272 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.633251 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 11:16:18.633373 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.633357 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qqsxn\""
Apr 17 11:16:18.634754 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.633633 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.634754 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.633981 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.634754 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.634292 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 11:16:18.634754 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.634339 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.634754 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.634492 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9sgbr\""
Apr 17 11:16:18.634754 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.634584 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s9wql\""
Apr 17 11:16:18.635438 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.634784 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 11:16:18.635438 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.634853 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.635438 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.635195 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 11:16:18.636209 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.636172 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:18.636331 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.636300 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:18.637430 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.637399 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 11:16:18.637524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.637509 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 11:16:18.637928 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.637907 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 11:16:18.637928 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.637923 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-77pwz\""
Apr 17 11:16:18.638296 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.638282 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:18.638379 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.638346 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:18.638761 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.638744 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.638841 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.638786 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.639635 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.639614 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq"
Apr 17 11:16:18.640944 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.640928 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bfkcr"
Apr 17 11:16:18.642149 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.642114 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.642241 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.642200 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:16:18.642241 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.642122 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.642332 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.642269 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kxvmt"
Apr 17 11:16:18.642332 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.642204 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q7f9b\""
Apr 17 11:16:18.643437 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.643422 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:16:18.643502 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.643435 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:16:18.644362 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.644212 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:16:18.644362 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.644308 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9sgx7\""
Apr 17 11:16:18.644603 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.644584 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:16:18.645726 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.645707 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dptdg\""
Apr 17 11:16:18.647740 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647703 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0e81b66-d7e8-4dcf-baec-e09afe76648c-host-slash\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n"
Apr 17 11:16:18.647740 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647738 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cnibin\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647764 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cni-binary-copy\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647789 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-kubernetes\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647811 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-kubelet\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-run-netns\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-ovn\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-tmp-dir\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54"
Apr 17 11:16:18.647910 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-system-cni-dir\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-var-lib-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.647980 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovnkube-config\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648026 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxqlz\" (UniqueName: \"kubernetes.io/projected/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-kube-api-access-pxqlz\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648049 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/048dc60e-f359-4a3d-b877-c94afe6b9af6-agent-certs\") pod \"konnectivity-agent-vjnsr\" (UID: \"048dc60e-f359-4a3d-b877-c94afe6b9af6\") " pod="kube-system/konnectivity-agent-vjnsr"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648072 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/048dc60e-f359-4a3d-b877-c94afe6b9af6-konnectivity-ca\") pod \"konnectivity-agent-vjnsr\" (UID: \"048dc60e-f359-4a3d-b877-c94afe6b9af6\") " pod="kube-system/konnectivity-agent-vjnsr"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648100 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysctl-d\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648159 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-lib-modules\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-tuned\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648236 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jn2l\" (UniqueName: \"kubernetes.io/projected/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-kube-api-access-9jn2l\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648277 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648261 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-cni-netd\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-hosts-file\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648311 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysconfig\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648346 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-systemd\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648374 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-run\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-cni-bin\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648441 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovn-node-metrics-cert\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648465 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648518 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqztk\" (UniqueName: \"kubernetes.io/projected/71c35dce-5b27-4704-95a2-e390345991dc-kube-api-access-nqztk\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-systemd\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648713 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-var-lib-kubelet\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648736 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-slash\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648763 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-etc-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648792 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-log-socket\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:16:18.648816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648807 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42p2\" (UniqueName: \"kubernetes.io/projected/b0e81b66-d7e8-4dcf-baec-e09afe76648c-kube-api-access-q42p2\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n"
Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-os-release\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gfp\" (UniqueName: \"kubernetes.io/projected/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-kube-api-access-v2gfp\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7"
Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648900 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-modprobe-d\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-sys\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864"
Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648954 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-host\") pod \"tuned-cd864\" (UID:
\"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648974 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-tmp\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.648995 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-node-log\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649012 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-env-overrides\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649035 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovnkube-script-lib\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhszc\" (UniqueName: 
\"kubernetes.io/projected/dada1323-9bb8-41bf-87e3-fddbcc3aa159-kube-api-access-fhszc\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysctl-conf\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-systemd-units\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649152 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-run-ovn-kubernetes\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.649474 
ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.649225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b0e81b66-d7e8-4dcf-baec-e09afe76648c-iptables-alerter-script\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.684801 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.684773 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:17 +0000 UTC" deadline="2028-01-20 05:46:51.795863864 +0000 UTC" Apr 17 11:16:18.684801 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.684801 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15426h30m33.111066437s" Apr 17 11:16:18.736960 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.736934 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:18.750227 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750162 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-host\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.750227 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750199 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-tmp\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.750227 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750217 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-node-log\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovnkube-script-lib\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750266 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhszc\" (UniqueName: \"kubernetes.io/projected/dada1323-9bb8-41bf-87e3-fddbcc3aa159-kube-api-access-fhszc\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750289 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0e81b66-d7e8-4dcf-baec-e09afe76648c-host-slash\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-host\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750315 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a280fe8d-c697-436b-8324-1581f46fa362-cni-binary-copy\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750354 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-etc-selinux\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750395 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-serviceca\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-kubelet\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750426 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-run-netns\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750457 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-conf-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750487 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovnkube-config\") 
pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.750524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750520 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxqlz\" (UniqueName: \"kubernetes.io/projected/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-kube-api-access-pxqlz\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/048dc60e-f359-4a3d-b877-c94afe6b9af6-konnectivity-ca\") pod \"konnectivity-agent-vjnsr\" (UID: \"048dc60e-f359-4a3d-b877-c94afe6b9af6\") " pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750557 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-sys-fs\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750572 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-system-cni-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-cnibin\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-socket-dir-parent\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750602 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0e81b66-d7e8-4dcf-baec-e09afe76648c-host-slash\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-lib-modules\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-cni-netd\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-hosts-file\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750661 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-os-release\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gfp\" (UniqueName: \"kubernetes.io/projected/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-kube-api-access-v2gfp\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750707 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-cni-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750726 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-netns\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysconfig\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-run\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.751189 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750738 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-cni-bin\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750794 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovn-node-metrics-cert\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750820 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-socket-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-os-release\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750857 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-hostroot\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750857 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovnkube-script-lib\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750902 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-node-log\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750952 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750996 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-run\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751032 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-kubelet\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.750997 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysconfig\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751078 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-device-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 
11:16:18.751110 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-run-netns\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751175 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-cni-bin\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.751930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751566 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751581 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751627 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovnkube-config\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751628 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-multus-certs\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751695 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-hosts-file\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751702 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-var-lib-kubelet\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751737 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-slash\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751746 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-lib-modules\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751750 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-var-lib-kubelet\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751655 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/048dc60e-f359-4a3d-b877-c94afe6b9af6-konnectivity-ca\") pod \"konnectivity-agent-vjnsr\" (UID: \"048dc60e-f359-4a3d-b877-c94afe6b9af6\") " pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751766 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-etc-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-cni-netd\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-log-socket\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751839 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-cni-bin\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751806 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-slash\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751876 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-modprobe-d\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.752907 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-etc-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-os-release\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751900 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-sys\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-log-socket\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-env-overrides\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " 
pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751974 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b0e81b66-d7e8-4dcf-baec-e09afe76648c-iptables-alerter-script\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-modprobe-d\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.751999 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-sys\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752027 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-registration-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752046 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysctl-conf\") pod 
\"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752080 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-systemd-units\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752109 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-run-ovn-kubernetes\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752170 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-host-run-ovn-kubernetes\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752173 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-systemd-units\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cnibin\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752279 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysctl-conf\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.753777 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cni-binary-copy\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752316 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cnibin\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752331 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dada1323-9bb8-41bf-87e3-fddbcc3aa159-env-overrides\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752357 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-kubernetes\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752400 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-ovn\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752461 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-kubernetes\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752501 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-tmp-dir\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752533 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-ovn\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752556 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/b0e81b66-d7e8-4dcf-baec-e09afe76648c-iptables-alerter-script\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752576 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-system-cni-dir\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752742 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-cni-multus\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752757 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-cni-binary-copy\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-kubelet\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752801 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nff55\" (UniqueName: \"kubernetes.io/projected/a280fe8d-c697-436b-8324-1581f46fa362-kube-api-access-nff55\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752803 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-tmp-dir\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-system-cni-dir\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752839 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-var-lib-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.754625 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-var-lib-openvswitch\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: 
I0417 11:16:18.752872 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/048dc60e-f359-4a3d-b877-c94afe6b9af6-agent-certs\") pod \"konnectivity-agent-vjnsr\" (UID: \"048dc60e-f359-4a3d-b877-c94afe6b9af6\") " pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752925 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysctl-d\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752959 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-tuned\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.752987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-sysctl-d\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jn2l\" (UniqueName: \"kubernetes.io/projected/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-kube-api-access-9jn2l\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 
11:16:18.753055 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqztk\" (UniqueName: \"kubernetes.io/projected/71c35dce-5b27-4704-95a2-e390345991dc-kube-api-access-nqztk\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753081 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-systemd\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753124 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753222 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-etc-kubernetes\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " 
pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753279 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-systemd\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753315 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-systemd\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753346 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q42p2\" (UniqueName: \"kubernetes.io/projected/b0e81b66-d7e8-4dcf-baec-e09afe76648c-kube-api-access-q42p2\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753375 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dada1323-9bb8-41bf-87e3-fddbcc3aa159-run-systemd\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.753417 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753425 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcr7z\" (UniqueName: \"kubernetes.io/projected/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-kube-api-access-wcr7z\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.755150 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.753611 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:16:19.253557879 +0000 UTC m=+3.023433036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753634 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-host\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgwmh\" (UniqueName: \"kubernetes.io/projected/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-kube-api-access-pgwmh\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753687 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-k8s-cni-cncf-io\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.753722 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a280fe8d-c697-436b-8324-1581f46fa362-multus-daemon-config\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.754496 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-tmp\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.754597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dada1323-9bb8-41bf-87e3-fddbcc3aa159-ovn-node-metrics-cert\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.755226 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/048dc60e-f359-4a3d-b877-c94afe6b9af6-agent-certs\") pod \"konnectivity-agent-vjnsr\" (UID: \"048dc60e-f359-4a3d-b877-c94afe6b9af6\") " pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:18.755743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.755473 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-etc-tuned\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.762984 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.762948 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:18.762984 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.762972 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:18.762984 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.762984 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:18.762984 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:18.763055 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:19.26303653 +0000 UTC m=+3.032911686 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:18.765105 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.765080 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gfp\" (UniqueName: \"kubernetes.io/projected/f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d-kube-api-access-v2gfp\") pod \"multus-additional-cni-plugins-fdfx7\" (UID: \"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d\") " pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.765203 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.765180 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxqlz\" (UniqueName: \"kubernetes.io/projected/6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb-kube-api-access-pxqlz\") pod \"node-resolver-kgg54\" (UID: \"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb\") " pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.766215 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.766192 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhszc\" (UniqueName: \"kubernetes.io/projected/dada1323-9bb8-41bf-87e3-fddbcc3aa159-kube-api-access-fhszc\") pod \"ovnkube-node-nt59h\" (UID: \"dada1323-9bb8-41bf-87e3-fddbcc3aa159\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.766303 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.766197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jn2l\" (UniqueName: \"kubernetes.io/projected/e4eb1882-8f3f-45aa-bdf1-10c1296b7af5-kube-api-access-9jn2l\") pod \"tuned-cd864\" (UID: \"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.766960 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.766942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqztk\" (UniqueName: \"kubernetes.io/projected/71c35dce-5b27-4704-95a2-e390345991dc-kube-api-access-nqztk\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:18.767501 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.767484 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42p2\" (UniqueName: \"kubernetes.io/projected/b0e81b66-d7e8-4dcf-baec-e09afe76648c-kube-api-access-q42p2\") pod \"iptables-alerter-wjx5n\" (UID: \"b0e81b66-d7e8-4dcf-baec-e09afe76648c\") " pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.806321 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.806290 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:18.854686 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-cni-multus\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.854840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-kubelet\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.854840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854723 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nff55\" (UniqueName: \"kubernetes.io/projected/a280fe8d-c697-436b-8324-1581f46fa362-kube-api-access-nff55\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.854840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-etc-kubernetes\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.854840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-cni-multus\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.854840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcr7z\" (UniqueName: \"kubernetes.io/projected/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-kube-api-access-wcr7z\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.854840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-host\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854854 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgwmh\" (UniqueName: \"kubernetes.io/projected/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-kube-api-access-pgwmh\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854881 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-k8s-cni-cncf-io\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a280fe8d-c697-436b-8324-1581f46fa362-multus-daemon-config\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854935 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a280fe8d-c697-436b-8324-1581f46fa362-cni-binary-copy\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.854959 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-etc-selinux\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 
11:16:18.854984 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-serviceca\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-conf-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-sys-fs\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-system-cni-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855086 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-cnibin\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855111 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-socket-dir-parent\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-cni-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-netns\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-kubelet\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-socket-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855215 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-sys-fs\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-etc-selinux\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855257 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-os-release\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855289 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-netns\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855294 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-cnibin\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855343 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-socket-dir-parent\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-hostroot\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-hostroot\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-socket-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855386 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-etc-kubernetes\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-os-release\") pod 
\"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-host\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.855525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-system-cni-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855457 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-device-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-multus-certs\") pod \"multus-kxvmt\" 
(UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855492 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-cni-bin\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855458 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-cni-dir\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855536 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-k8s-cni-cncf-io\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-registration-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: 
\"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855569 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-device-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855586 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-run-multus-certs\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855595 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-host-var-lib-cni-bin\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855614 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-registration-dir\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855655 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a280fe8d-c697-436b-8324-1581f46fa362-multus-conf-dir\") pod 
\"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855672 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-serviceca\") pod \"node-ca-bfkcr\" (UID: \"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.855808 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a280fe8d-c697-436b-8324-1581f46fa362-cni-binary-copy\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.856409 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.856119 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a280fe8d-c697-436b-8324-1581f46fa362-multus-daemon-config\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.863854 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.863798 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcr7z\" (UniqueName: \"kubernetes.io/projected/ae83b4c3-ff01-4e96-ba32-5de3e77ba49e-kube-api-access-wcr7z\") pod \"aws-ebs-csi-driver-node-fngdq\" (UID: \"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.866297 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.864281 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgwmh\" (UniqueName: \"kubernetes.io/projected/2604f0e7-0ee7-4d02-adf7-f046ecf35e36-kube-api-access-pgwmh\") pod \"node-ca-bfkcr\" (UID: 
\"2604f0e7-0ee7-4d02-adf7-f046ecf35e36\") " pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.866297 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.864324 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nff55\" (UniqueName: \"kubernetes.io/projected/a280fe8d-c697-436b-8324-1581f46fa362-kube-api-access-nff55\") pod \"multus-kxvmt\" (UID: \"a280fe8d-c697-436b-8324-1581f46fa362\") " pod="openshift-multus/multus-kxvmt" Apr 17 11:16:18.939112 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.939077 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cd864" Apr 17 11:16:18.947840 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.947811 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wjx5n" Apr 17 11:16:18.958110 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.958089 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:18.963697 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.963675 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:18.971322 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.971296 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kgg54" Apr 17 11:16:18.978962 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.978943 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" Apr 17 11:16:18.986232 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.986206 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" Apr 17 11:16:18.992770 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.992751 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bfkcr" Apr 17 11:16:18.998264 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:18.998244 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kxvmt" Apr 17 11:16:19.258333 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.258270 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:19.258471 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:19.258383 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:19.258471 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:19.258433 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.258420419 +0000 UTC m=+4.028295554 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:19.274846 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.274816 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod048dc60e_f359_4a3d_b877_c94afe6b9af6.slice/crio-c56f1303124519f2f08e990299eb86f065e10877599b64226f6eab7f0a91284a WatchSource:0}: Error finding container c56f1303124519f2f08e990299eb86f065e10877599b64226f6eab7f0a91284a: Status 404 returned error can't find the container with id c56f1303124519f2f08e990299eb86f065e10877599b64226f6eab7f0a91284a Apr 17 11:16:19.275815 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.275793 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddada1323_9bb8_41bf_87e3_fddbcc3aa159.slice/crio-6f776dade93cd3d654c72e047df4f3ead676ab7c1e779db0d971822e83d57519 WatchSource:0}: Error finding container 6f776dade93cd3d654c72e047df4f3ead676ab7c1e779db0d971822e83d57519: Status 404 returned error can't find the container with id 6f776dade93cd3d654c72e047df4f3ead676ab7c1e779db0d971822e83d57519 Apr 17 11:16:19.277192 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.277170 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf53bbbe9_bf96_40ee_8b77_64c3d3d8d96d.slice/crio-3b5d839631f0efbd7c11cbadff4375a5baefde939fdf17752955177abb9e60c0 WatchSource:0}: Error finding container 3b5d839631f0efbd7c11cbadff4375a5baefde939fdf17752955177abb9e60c0: Status 404 returned error can't find the container with id 3b5d839631f0efbd7c11cbadff4375a5baefde939fdf17752955177abb9e60c0 Apr 17 11:16:19.282769 
ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.282741 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae83b4c3_ff01_4e96_ba32_5de3e77ba49e.slice/crio-4f045e9320dc035c7ce0a3f9a5a073500d5e9b51e9b9b087d444413b15491056 WatchSource:0}: Error finding container 4f045e9320dc035c7ce0a3f9a5a073500d5e9b51e9b9b087d444413b15491056: Status 404 returned error can't find the container with id 4f045e9320dc035c7ce0a3f9a5a073500d5e9b51e9b9b087d444413b15491056 Apr 17 11:16:19.283662 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.283638 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2604f0e7_0ee7_4d02_adf7_f046ecf35e36.slice/crio-bf4be91775c3ffcadd0956cc3f1be20cd90c62007fda6c2bac68d702d8c35ece WatchSource:0}: Error finding container bf4be91775c3ffcadd0956cc3f1be20cd90c62007fda6c2bac68d702d8c35ece: Status 404 returned error can't find the container with id bf4be91775c3ffcadd0956cc3f1be20cd90c62007fda6c2bac68d702d8c35ece Apr 17 11:16:19.284236 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.284213 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e81b66_d7e8_4dcf_baec_e09afe76648c.slice/crio-30cafe1d404b0d08f3c28ec8231a3e74af2a9de0452c4b67c51d63a63c76548e WatchSource:0}: Error finding container 30cafe1d404b0d08f3c28ec8231a3e74af2a9de0452c4b67c51d63a63c76548e: Status 404 returned error can't find the container with id 30cafe1d404b0d08f3c28ec8231a3e74af2a9de0452c4b67c51d63a63c76548e Apr 17 11:16:19.285434 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.285400 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4eb1882_8f3f_45aa_bdf1_10c1296b7af5.slice/crio-aae60a3cd43e7342cb76df257d49381d99ec357fc56356e334017fee2ab3de08 WatchSource:0}: 
Error finding container aae60a3cd43e7342cb76df257d49381d99ec357fc56356e334017fee2ab3de08: Status 404 returned error can't find the container with id aae60a3cd43e7342cb76df257d49381d99ec357fc56356e334017fee2ab3de08 Apr 17 11:16:19.286344 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.286259 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda280fe8d_c697_436b_8324_1581f46fa362.slice/crio-054519ca0ff7e9ad0f77435ea9e4968fe4b2ed12c482ede778418be27614b554 WatchSource:0}: Error finding container 054519ca0ff7e9ad0f77435ea9e4968fe4b2ed12c482ede778418be27614b554: Status 404 returned error can't find the container with id 054519ca0ff7e9ad0f77435ea9e4968fe4b2ed12c482ede778418be27614b554 Apr 17 11:16:19.289230 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:19.289207 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6830c0fb_a8f5_4cb8_a8a1_5f307a010dcb.slice/crio-c9f594646aea85f0d4143dae6ef1bba594e76baa5b8f8fa2e5a2c96f7b29b17c WatchSource:0}: Error finding container c9f594646aea85f0d4143dae6ef1bba594e76baa5b8f8fa2e5a2c96f7b29b17c: Status 404 returned error can't find the container with id c9f594646aea85f0d4143dae6ef1bba594e76baa5b8f8fa2e5a2c96f7b29b17c Apr 17 11:16:19.358924 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.358775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:19.359037 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:19.358923 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:19.359037 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:19.358941 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:19.359037 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:19.358952 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:19.359037 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:19.359021 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.359002996 +0000 UTC m=+4.128878136 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:19.685621 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.685502 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:17 +0000 UTC" deadline="2027-12-05 21:02:18.699252023 +0000 UTC"
Apr 17 11:16:19.685621 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.685543 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14337h45m59.013713404s"
Apr 17 11:16:19.792249 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.792202 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kxvmt" event={"ID":"a280fe8d-c697-436b-8324-1581f46fa362","Type":"ContainerStarted","Data":"054519ca0ff7e9ad0f77435ea9e4968fe4b2ed12c482ede778418be27614b554"}
Apr 17 11:16:19.794177 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.793989 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bfkcr" event={"ID":"2604f0e7-0ee7-4d02-adf7-f046ecf35e36","Type":"ContainerStarted","Data":"bf4be91775c3ffcadd0956cc3f1be20cd90c62007fda6c2bac68d702d8c35ece"}
Apr 17 11:16:19.796677 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.796641 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" event={"ID":"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e","Type":"ContainerStarted","Data":"4f045e9320dc035c7ce0a3f9a5a073500d5e9b51e9b9b087d444413b15491056"}
Apr 17 11:16:19.801396 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.801371 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"6f776dade93cd3d654c72e047df4f3ead676ab7c1e779db0d971822e83d57519"}
Apr 17 11:16:19.817943 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.817902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cd864" event={"ID":"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5","Type":"ContainerStarted","Data":"aae60a3cd43e7342cb76df257d49381d99ec357fc56356e334017fee2ab3de08"}
Apr 17 11:16:19.830403 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.830370 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wjx5n" event={"ID":"b0e81b66-d7e8-4dcf-baec-e09afe76648c","Type":"ContainerStarted","Data":"30cafe1d404b0d08f3c28ec8231a3e74af2a9de0452c4b67c51d63a63c76548e"}
Apr 17 11:16:19.834344 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.834294 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerStarted","Data":"3b5d839631f0efbd7c11cbadff4375a5baefde939fdf17752955177abb9e60c0"}
Apr 17 11:16:19.835987 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.835963 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vjnsr" event={"ID":"048dc60e-f359-4a3d-b877-c94afe6b9af6","Type":"ContainerStarted","Data":"c56f1303124519f2f08e990299eb86f065e10877599b64226f6eab7f0a91284a"}
Apr 17 11:16:19.842660 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.842623 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal" event={"ID":"0fb21237899613074192be5ee06d9825","Type":"ContainerStarted","Data":"4eca43842a130ac41025fbd922043c88604aeae6c20b892d79111cb313b42afb"}
Apr 17 11:16:19.846118 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.846095 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:19.851206 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:19.851182 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kgg54" event={"ID":"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb","Type":"ContainerStarted","Data":"c9f594646aea85f0d4143dae6ef1bba594e76baa5b8f8fa2e5a2c96f7b29b17c"}
Apr 17 11:16:20.266396 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.266351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:20.266581 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.266543 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:20.266644 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.266611 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:16:22.266589752 +0000 UTC m=+6.036464906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:20.368067 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.367374 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:20.368067 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.367613 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:20.368067 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.367634 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:20.368067 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.367647 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:20.368067 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.367709 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:22.367687827 +0000 UTC m=+6.137562965 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:20.789932 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.789292 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:20.789932 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.789403 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:20.789932 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.789776 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:20.789932 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:20.789866 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:20.857833 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.857796 2580 generic.go:358] "Generic (PLEG): container finished" podID="ac675744bd931602b0121ca520dff9a8" containerID="19142034c936cfaf2b59271e04e2bb0f8841a48ae31f87a0c792b1b443f37c1d" exitCode=0
Apr 17 11:16:20.858508 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.858482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" event={"ID":"ac675744bd931602b0121ca520dff9a8","Type":"ContainerDied","Data":"19142034c936cfaf2b59271e04e2bb0f8841a48ae31f87a0c792b1b443f37c1d"}
Apr 17 11:16:20.878178 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:20.878111 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-188.ec2.internal" podStartSLOduration=2.878093335 podStartE2EDuration="2.878093335s" podCreationTimestamp="2026-04-17 11:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:19.856428903 +0000 UTC m=+3.626304041" watchObservedRunningTime="2026-04-17 11:16:20.878093335 +0000 UTC m=+4.647968493"
Apr 17 11:16:21.869947 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:21.869909 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" event={"ID":"ac675744bd931602b0121ca520dff9a8","Type":"ContainerStarted","Data":"66601887eefebab24dd2b564300f9ef1e186a2e2aab05c47c84e289bd1a1c254"}
Apr 17 11:16:22.284927 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:22.284889 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:22.285125 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.285044 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:22.285125 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.285107 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.285087597 +0000 UTC m=+10.054962738 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:22.385646 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:22.385600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:22.385828 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.385798 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:22.385900 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.385837 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:22.385900 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.385851 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:22.386002 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.385915 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.385895726 +0000 UTC m=+10.155770874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:22.781725 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:22.781688 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:22.781897 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.781834 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:22.782396 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:22.782234 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:22.782396 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:22.782335 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:24.779700 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:24.779655 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:24.780213 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:24.779789 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:24.780297 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:24.780272 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:24.780411 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:24.780367 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:26.315842 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:26.315805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:26.316277 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.315989 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:26.316277 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.316065 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:16:34.316041639 +0000 UTC m=+18.085916775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:26.416954 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:26.416914 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:26.417128 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.417110 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:26.417249 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.417152 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:26.417249 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.417166 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:26.417249 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.417226 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:34.417205416 +0000 UTC m=+18.187080559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:26.781842 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:26.781810 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:26.782023 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.781923 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:26.782730 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:26.782325 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:26.782730 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:26.782410 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:28.778832 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:28.778794 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:28.779250 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:28.778802 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:28.779250 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:28.778930 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:28.779250 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:28.778995 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:30.779896 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:30.779858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:30.779896 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:30.779887 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:30.780640 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:30.779985 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:30.780640 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:30.780120 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:32.782305 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:32.782266 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:32.782747 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:32.782281 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:32.782747 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:32.782374 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:32.782747 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:32.782492 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:34.380562 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:34.380518 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:34.380995 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.380693 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:34.380995 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.380760 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.380743168 +0000 UTC m=+34.150618304 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:34.481852 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:34.481807 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:34.482009 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.481966 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:34.482009 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.481990 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:34.482009 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.482003 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:34.482107 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.482067 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.482048356 +0000 UTC m=+34.251923502 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:34.781606 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:34.781579 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:34.781789 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:34.781579 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:34.781789 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.781716 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:34.781789 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:34.781753 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:36.782243 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:36.782209 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:16:36.782574 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:36.782222 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:16:36.782574 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:36.782336 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:16:36.782574 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:36.782393 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f"
Apr 17 11:16:37.913889 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.913506 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vjnsr" event={"ID":"048dc60e-f359-4a3d-b877-c94afe6b9af6","Type":"ContainerStarted","Data":"cfa4afed9f3d5c5af9c7627fc56c41f9c56968fdd8ca96202f8774163fcc9f76"}
Apr 17 11:16:37.914807 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.914785 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kgg54" event={"ID":"6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb","Type":"ContainerStarted","Data":"c061c9d682cdc903fc3e5bb299a6cbc79d44459dc91762212996568a28c33ab4"}
Apr 17 11:16:37.916018 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.915998 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kxvmt" event={"ID":"a280fe8d-c697-436b-8324-1581f46fa362","Type":"ContainerStarted","Data":"81f2e98de0c6f79cd534b46ac9aa87c06cc85efad007547bf74c9b65e835eded"}
Apr 17 11:16:37.919865 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.919842 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bfkcr" event={"ID":"2604f0e7-0ee7-4d02-adf7-f046ecf35e36","Type":"ContainerStarted","Data":"4b1caccbe499ca7bcd7cf8d97fc09f1cbd9ade261d2a717be886b3ffeb5ccb3b"}
Apr 17 11:16:37.921173 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.921154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" event={"ID":"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e","Type":"ContainerStarted","Data":"ea3812f156287cc56d03ac90aeed7d1431a725c084721b515ce9e4c1d4069bb2"}
Apr 17 11:16:37.923417 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.923397 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"3a7c0b47914676c39b5078d6d1344b587d627f5330d3a5ac35e95dd960f2033b"}
Apr 17 11:16:37.923499 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.923421 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"1bbcf9c968ca17baef7f95ae281bdce46666219b3f5252452246e5490a895455"}
Apr 17 11:16:37.923499 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.923430 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"72a2b2128a814dfc6d51fa0ef2d24b2cf2f3baa5f5e45f073df4128fbed015e8"}
Apr 17 11:16:37.923499 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.923438 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"a0fbb11eb82223e356e15ea6dd6698a6efa8ac6642b82eecfeb40496f4131662"}
Apr 17 11:16:37.923499 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.923450 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"f426f0cb23d135058830d4ea7a0a10da28f150ef91daa50c48343a7e99ff7d38"}
Apr 17 11:16:37.924638 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.924620 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cd864" event={"ID":"e4eb1882-8f3f-45aa-bdf1-10c1296b7af5","Type":"ContainerStarted","Data":"b24276a0a6371906665ac5f5084bf15caabd1469a12b0255c8390025b937dc21"}
Apr 17 11:16:37.925885 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.925863 2580 generic.go:358] "Generic (PLEG): container finished" podID="f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d" containerID="4f449485aac4534db71acefb36b0a639e7693b8af4bb730b397c89df5aef38a2" exitCode=0
Apr 17 11:16:37.925964 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.925897 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerDied","Data":"4f449485aac4534db71acefb36b0a639e7693b8af4bb730b397c89df5aef38a2"}
Apr 17 11:16:37.936110 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.936065 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-188.ec2.internal" podStartSLOduration=19.936053318 podStartE2EDuration="19.936053318s" podCreationTimestamp="2026-04-17 11:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:21.886881276 +0000 UTC m=+5.656756436" watchObservedRunningTime="2026-04-17 11:16:37.936053318 +0000 UTC m=+21.705928475"
Apr 17 11:16:37.967593 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.967549 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vjnsr" podStartSLOduration=4.397700953 podStartE2EDuration="21.967536111s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.276392257 +0000 UTC m=+3.046267393" lastFinishedPulling="2026-04-17 11:16:36.8462274 +0000 UTC m=+20.616102551" observedRunningTime="2026-04-17 11:16:37.935894127 +0000 UTC m=+21.705769284" watchObservedRunningTime="2026-04-17 11:16:37.967536111 +0000 UTC m=+21.737411321"
Apr 17 11:16:37.985250 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:37.985203 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cd864" podStartSLOduration=4.407682515
podStartE2EDuration="21.985189966s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.287509279 +0000 UTC m=+3.057384417" lastFinishedPulling="2026-04-17 11:16:36.865016727 +0000 UTC m=+20.634891868" observedRunningTime="2026-04-17 11:16:37.984703253 +0000 UTC m=+21.754578411" watchObservedRunningTime="2026-04-17 11:16:37.985189966 +0000 UTC m=+21.755065124" Apr 17 11:16:38.001648 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.001604 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kgg54" podStartSLOduration=4.427759357 podStartE2EDuration="22.00159096s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.29118166 +0000 UTC m=+3.061056811" lastFinishedPulling="2026-04-17 11:16:36.865013273 +0000 UTC m=+20.634888414" observedRunningTime="2026-04-17 11:16:38.001345854 +0000 UTC m=+21.771221015" watchObservedRunningTime="2026-04-17 11:16:38.00159096 +0000 UTC m=+21.771466117" Apr 17 11:16:38.037828 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.037775 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bfkcr" podStartSLOduration=4.457958326 podStartE2EDuration="22.037756141s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.285164033 +0000 UTC m=+3.055039175" lastFinishedPulling="2026-04-17 11:16:36.864961841 +0000 UTC m=+20.634836990" observedRunningTime="2026-04-17 11:16:38.018667235 +0000 UTC m=+21.788542393" watchObservedRunningTime="2026-04-17 11:16:38.037756141 +0000 UTC m=+21.807631300" Apr 17 11:16:38.037937 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.037895 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kxvmt" podStartSLOduration=3.432925515 podStartE2EDuration="21.037888945s" podCreationTimestamp="2026-04-17 11:16:17 +0000 UTC" 
firstStartedPulling="2026-04-17 11:16:19.288419533 +0000 UTC m=+3.058294672" lastFinishedPulling="2026-04-17 11:16:36.893382954 +0000 UTC m=+20.663258102" observedRunningTime="2026-04-17 11:16:38.037503692 +0000 UTC m=+21.807378849" watchObservedRunningTime="2026-04-17 11:16:38.037888945 +0000 UTC m=+21.807764103" Apr 17 11:16:38.181384 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.181357 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:16:38.711986 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.711850 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:38.18137656Z","UUID":"b5e8fd02-3c8e-4a1c-aa96-dca5cd82db8d","Handler":null,"Name":"","Endpoint":""} Apr 17 11:16:38.713839 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.713814 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 11:16:38.713979 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.713847 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:16:38.783150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.783108 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:38.783150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.783123 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:38.783335 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:38.783255 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:38.783385 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:38.783367 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:38.929963 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.929928 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" event={"ID":"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e","Type":"ContainerStarted","Data":"7bb5656888ea7ba611bc8d03f999fe9ddb5e6ed941d29115290e8d083c8f10b3"} Apr 17 11:16:38.933343 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.933305 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"61c50a50a88101ec06ee9a496e8d18c223c36ca3dc98880490bdfdbfa2247a95"} Apr 17 11:16:38.935051 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.934953 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wjx5n" 
event={"ID":"b0e81b66-d7e8-4dcf-baec-e09afe76648c","Type":"ContainerStarted","Data":"e4304aff167e0467db14a68d856e184564e8e25927f7dd6e340e49b912c99704"} Apr 17 11:16:38.952802 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.952751 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wjx5n" podStartSLOduration=5.374111861 podStartE2EDuration="22.952735682s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.286222408 +0000 UTC m=+3.056097548" lastFinishedPulling="2026-04-17 11:16:36.864846228 +0000 UTC m=+20.634721369" observedRunningTime="2026-04-17 11:16:38.952049128 +0000 UTC m=+22.721924286" watchObservedRunningTime="2026-04-17 11:16:38.952735682 +0000 UTC m=+22.722610839" Apr 17 11:16:38.957231 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.957187 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:38.958179 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:38.958122 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:39.939382 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:39.939335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" event={"ID":"ae83b4c3-ff01-4e96-ba32-5de3e77ba49e","Type":"ContainerStarted","Data":"43f40f8d9274a990fb644807a8e874ec27b0c290b3338e9733bb1e80e5a10907"} Apr 17 11:16:39.940005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:39.939502 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:39.940177 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:39.940158 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vjnsr" Apr 17 11:16:39.957832 ip-10-0-135-188 
kubenswrapper[2580]: I0417 11:16:39.957778 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fngdq" podStartSLOduration=4.1349421379999995 podStartE2EDuration="23.957765186s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.284796144 +0000 UTC m=+3.054671294" lastFinishedPulling="2026-04-17 11:16:39.107619193 +0000 UTC m=+22.877494342" observedRunningTime="2026-04-17 11:16:39.957620581 +0000 UTC m=+23.727495816" watchObservedRunningTime="2026-04-17 11:16:39.957765186 +0000 UTC m=+23.727640345" Apr 17 11:16:40.782423 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:40.782390 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:40.782589 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:40.782390 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:40.782589 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:40.782500 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:40.782589 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:40.782562 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:40.945079 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:40.945032 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"7a2f2cd7362c76d4a7b598411692572356ac4c0758bc9add0cc0ab987e1d7860"} Apr 17 11:16:42.782017 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.781826 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:42.782525 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.781826 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:42.782525 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:42.782116 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:42.782525 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:42.782166 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:42.952156 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.952100 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" event={"ID":"dada1323-9bb8-41bf-87e3-fddbcc3aa159","Type":"ContainerStarted","Data":"636968df210da3d7fe316f525798d3120cfa759b8c726adeb3b9e4ffccd90c7f"} Apr 17 11:16:42.952445 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.952426 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:42.953755 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.953729 2580 generic.go:358] "Generic (PLEG): container finished" podID="f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d" containerID="58d1a959f065990fbafbec863cac2fd94d6a68b684c23d104a4c1b46f4914c6c" exitCode=0 Apr 17 11:16:42.953858 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.953778 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerDied","Data":"58d1a959f065990fbafbec863cac2fd94d6a68b684c23d104a4c1b46f4914c6c"} Apr 17 11:16:42.968849 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.968824 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:42.992319 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:42.992275 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" podStartSLOduration=8.973185102 podStartE2EDuration="26.992261373s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.279321952 +0000 UTC m=+3.049197091" lastFinishedPulling="2026-04-17 11:16:37.298398225 +0000 UTC m=+21.068273362" observedRunningTime="2026-04-17 
11:16:42.990656141 +0000 UTC m=+26.760531297" watchObservedRunningTime="2026-04-17 11:16:42.992261373 +0000 UTC m=+26.762136531" Apr 17 11:16:43.958506 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:43.958347 2580 generic.go:358] "Generic (PLEG): container finished" podID="f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d" containerID="8e4278afe2678d920dc5f09a0761ce9cc4cd5ae02ae727f6d5bc384eb734a90d" exitCode=0 Apr 17 11:16:43.958506 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:43.958422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerDied","Data":"8e4278afe2678d920dc5f09a0761ce9cc4cd5ae02ae727f6d5bc384eb734a90d"} Apr 17 11:16:43.959299 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:43.959214 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:43.959299 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:43.959241 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:43.974188 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:43.974167 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h" Apr 17 11:16:44.135276 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:44.135244 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b97qz"] Apr 17 11:16:44.135415 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:44.135369 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:44.135479 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:44.135461 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:44.139017 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:44.138993 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s9wws"] Apr 17 11:16:44.139158 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:44.139109 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:44.139257 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:44.139236 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:44.962743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:44.962713 2580 generic.go:358] "Generic (PLEG): container finished" podID="f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d" containerID="4fab2d32a15c1ec65f65bbc957cd923f13d216c4d03c07ad11d55d5ba043c10e" exitCode=0 Apr 17 11:16:44.963316 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:44.962807 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerDied","Data":"4fab2d32a15c1ec65f65bbc957cd923f13d216c4d03c07ad11d55d5ba043c10e"} Apr 17 11:16:45.779388 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:45.779351 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:45.779598 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:45.779404 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:45.779598 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:45.779521 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:45.779713 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:45.779640 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:47.779381 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:47.779344 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:47.780028 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:47.779347 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:47.780028 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:47.779507 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:47.780028 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:47.779563 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:49.779219 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:49.779181 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:49.779667 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:49.779183 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:49.779667 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:49.779317 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b97qz" podUID="37df9c48-6708-4b3b-9cca-a6c82f4f253f" Apr 17 11:16:49.779667 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:49.779422 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc" Apr 17 11:16:50.099943 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.099740 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-188.ec2.internal" event="NodeReady" Apr 17 11:16:50.100179 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.100082 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:50.178749 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.178719 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mwrb7"] Apr 17 11:16:50.202850 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.202807 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b4df6"] Apr 17 11:16:50.203020 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.202976 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.205953 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.205930 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:50.206171 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.206126 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvwdc\"" Apr 17 11:16:50.206261 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.206238 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:50.218421 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.218400 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mwrb7"] Apr 17 11:16:50.218421 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.218424 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b4df6"] Apr 17 11:16:50.218595 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.218528 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.221531 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.221507 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:50.221652 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.221536 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:50.221652 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.221601 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:50.221873 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.221852 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-whkkk\"" Apr 17 11:16:50.295428 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.295385 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbb6\" (UniqueName: \"kubernetes.io/projected/0e6ca43a-9f71-4557-be86-206743aee65b-kube-api-access-qnbb6\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.295626 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.295463 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.295626 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.295493 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrbh\" 
(UniqueName: \"kubernetes.io/projected/7991569c-ec27-417a-8b37-b1129ca90932-kube-api-access-8hrbh\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.295626 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.295521 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7991569c-ec27-417a-8b37-b1129ca90932-tmp-dir\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.295626 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.295560 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7991569c-ec27-417a-8b37-b1129ca90932-config-volume\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.295626 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.295585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.396733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396607 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7991569c-ec27-417a-8b37-b1129ca90932-tmp-dir\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.396733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396664 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7991569c-ec27-417a-8b37-b1129ca90932-config-volume\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.396733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.396733 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396719 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396758 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbb6\" (UniqueName: \"kubernetes.io/projected/0e6ca43a-9f71-4557-be86-206743aee65b-kube-api-access-qnbb6\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396810 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.396812 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found 
Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.396833 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrbh\" (UniqueName: \"kubernetes.io/projected/7991569c-ec27-417a-8b37-b1129ca90932-kube-api-access-8hrbh\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.396890 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.896868282 +0000 UTC m=+34.666743454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.396893 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.396954 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:17:22.396934324 +0000 UTC m=+66.166809471 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.397015 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.397019 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7991569c-ec27-417a-8b37-b1129ca90932-tmp-dir\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.397081 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.397049 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:50.897038306 +0000 UTC m=+34.666913460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found Apr 17 11:16:50.397616 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.397302 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7991569c-ec27-417a-8b37-b1129ca90932-config-volume\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.410422 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.410390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrbh\" (UniqueName: \"kubernetes.io/projected/7991569c-ec27-417a-8b37-b1129ca90932-kube-api-access-8hrbh\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.410607 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.410479 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbb6\" (UniqueName: \"kubernetes.io/projected/0e6ca43a-9f71-4557-be86-206743aee65b-kube-api-access-qnbb6\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.498228 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.498184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:50.498428 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.498376 2580 projected.go:289] Couldn't 
get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:50.498428 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.498403 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:50.498428 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.498413 2580 projected.go:194] Error preparing data for projected volume kube-api-access-29pzs for pod openshift-network-diagnostics/network-check-target-b97qz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:50.498591 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.498484 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs podName:37df9c48-6708-4b3b-9cca-a6c82f4f253f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:22.498467501 +0000 UTC m=+66.268342648 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-29pzs" (UniqueName: "kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs") pod "network-check-target-b97qz" (UID: "37df9c48-6708-4b3b-9cca-a6c82f4f253f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:50.901641 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.901608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:50.901641 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:50.901658 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:50.902121 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.901754 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:50.902121 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.901754 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:50.902121 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.901825 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.9018018 +0000 UTC m=+35.671676939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found Apr 17 11:16:50.902121 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:50.901839 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.901833052 +0000 UTC m=+35.671708188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found Apr 17 11:16:51.779352 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.779313 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz" Apr 17 11:16:51.779586 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.779313 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:16:51.783655 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.783627 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:51.783655 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.783652 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:51.783847 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.783630 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dlffp\"" Apr 17 11:16:51.783847 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.783633 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q4j6t\"" Apr 17 11:16:51.783847 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.783631 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:51.908690 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.908652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:51.909198 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.908708 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:51.909198 ip-10-0-135-188 kubenswrapper[2580]: 
E0417 11:16:51.908801 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:51.909198 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:51.908803 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:51.909198 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:51.908856 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:53.908841385 +0000 UTC m=+37.678716521 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found Apr 17 11:16:51.909198 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:51.908868 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:53.908862477 +0000 UTC m=+37.678737613 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found Apr 17 11:16:51.979186 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.979152 2580 generic.go:358] "Generic (PLEG): container finished" podID="f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d" containerID="c5b8c6313de0176bb2a6a4577444067d6219f03084ee6f52e7efc0601692e1ec" exitCode=0 Apr 17 11:16:51.979338 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:51.979198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerDied","Data":"c5b8c6313de0176bb2a6a4577444067d6219f03084ee6f52e7efc0601692e1ec"} Apr 17 11:16:52.720851 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.720817 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7"] Apr 17 11:16:52.749757 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.749724 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7"] Apr 17 11:16:52.749904 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.749836 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:52.753293 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.753264 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 11:16:52.753418 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.753291 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 11:16:52.754829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.754802 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 11:16:52.754957 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.754821 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 11:16:52.917908 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.917870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cb8b0150-d78f-47c4-8c14-3f44a773fc28-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:52.917908 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.917913 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb8b0150-d78f-47c4-8c14-3f44a773fc28-tmp\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 
11:16:52.918387 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.918026 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj9l5\" (UniqueName: \"kubernetes.io/projected/cb8b0150-d78f-47c4-8c14-3f44a773fc28-kube-api-access-xj9l5\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:52.983805 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.983722 2580 generic.go:358] "Generic (PLEG): container finished" podID="f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d" containerID="94afbd824e35d1b196bcd563b8175dd3621173d2314c2607aa722fc744e4f387" exitCode=0 Apr 17 11:16:52.983946 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:52.983803 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerDied","Data":"94afbd824e35d1b196bcd563b8175dd3621173d2314c2607aa722fc744e4f387"} Apr 17 11:16:53.018569 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.018524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cb8b0150-d78f-47c4-8c14-3f44a773fc28-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.018714 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.018597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb8b0150-d78f-47c4-8c14-3f44a773fc28-tmp\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.018714 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.018665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj9l5\" (UniqueName: \"kubernetes.io/projected/cb8b0150-d78f-47c4-8c14-3f44a773fc28-kube-api-access-xj9l5\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.019052 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.019028 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb8b0150-d78f-47c4-8c14-3f44a773fc28-tmp\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.027224 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.027197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cb8b0150-d78f-47c4-8c14-3f44a773fc28-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.029865 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.029839 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj9l5\" (UniqueName: \"kubernetes.io/projected/cb8b0150-d78f-47c4-8c14-3f44a773fc28-kube-api-access-xj9l5\") pod \"klusterlet-addon-workmgr-7f6d6797d5-4dlr7\" (UID: \"cb8b0150-d78f-47c4-8c14-3f44a773fc28\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.062093 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.062061 2580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:53.208962 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.208885 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7"] Apr 17 11:16:53.214166 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:16:53.214104 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb8b0150_d78f_47c4_8c14_3f44a773fc28.slice/crio-487894919d6d7c26096138f8df4c0ec148e89f6651de48cda0f3f4264c0dc554 WatchSource:0}: Error finding container 487894919d6d7c26096138f8df4c0ec148e89f6651de48cda0f3f4264c0dc554: Status 404 returned error can't find the container with id 487894919d6d7c26096138f8df4c0ec148e89f6651de48cda0f3f4264c0dc554 Apr 17 11:16:53.926674 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.926642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:16:53.927254 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.926712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:53.927254 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:53.926799 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:53.927254 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:53.926797 2580 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:53.927254 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:53.926853 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:57.926838422 +0000 UTC m=+41.696713558 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found Apr 17 11:16:53.927254 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:53.926867 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:57.926860772 +0000 UTC m=+41.696735907 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found Apr 17 11:16:53.989093 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.989013 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" event={"ID":"f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d","Type":"ContainerStarted","Data":"0a48cf11ca3a1ba598ba73258266326d55791b448c3ce1200912ac16ad6ca78d"} Apr 17 11:16:53.990209 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:53.990183 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" event={"ID":"cb8b0150-d78f-47c4-8c14-3f44a773fc28","Type":"ContainerStarted","Data":"487894919d6d7c26096138f8df4c0ec148e89f6651de48cda0f3f4264c0dc554"} Apr 17 11:16:54.014659 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:54.014601 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fdfx7" podStartSLOduration=6.362891698 podStartE2EDuration="38.01458178s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.281582135 +0000 UTC m=+3.051457271" lastFinishedPulling="2026-04-17 11:16:50.933272214 +0000 UTC m=+34.703147353" observedRunningTime="2026-04-17 11:16:54.012424598 +0000 UTC m=+37.782299757" watchObservedRunningTime="2026-04-17 11:16:54.01458178 +0000 UTC m=+37.784456940" Apr 17 11:16:57.963513 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:57.963459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" 
Apr 17 11:16:57.964042 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:57.963586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:16:57.964042 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:57.963626 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:57.964042 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:57.963692 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:57.964042 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:57.963729 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:05.963707191 +0000 UTC m=+49.733582339 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found Apr 17 11:16:57.964042 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:16:57.963752 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:05.963736985 +0000 UTC m=+49.733612135 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found Apr 17 11:16:57.999247 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:57.999208 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" event={"ID":"cb8b0150-d78f-47c4-8c14-3f44a773fc28","Type":"ContainerStarted","Data":"dd039347b7a44e48aff58e42b0dbecc44e1b20c141e0f5224107aa9c81b75dcd"} Apr 17 11:16:57.999460 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:57.999439 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:58.001204 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:58.001181 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:16:58.015741 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:16:58.015699 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" podStartSLOduration=1.8596293400000001 podStartE2EDuration="6.015686739s" podCreationTimestamp="2026-04-17 11:16:52 +0000 UTC" firstStartedPulling="2026-04-17 11:16:53.215966699 +0000 UTC m=+36.985841835" lastFinishedPulling="2026-04-17 11:16:57.372024097 +0000 UTC m=+41.141899234" observedRunningTime="2026-04-17 11:16:58.015305949 +0000 UTC m=+41.785181106" watchObservedRunningTime="2026-04-17 11:16:58.015686739 +0000 UTC m=+41.785561896" Apr 17 11:17:06.021702 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:06.021660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7"
Apr 17 11:17:06.022197 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:06.021712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6"
Apr 17 11:17:06.022197 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:06.021815 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:06.022197 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:06.021858 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:06.022197 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:06.021873 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:22.021857724 +0000 UTC m=+65.791732864 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:06.022197 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:06.021898 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:22.021887059 +0000 UTC m=+65.791762195 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found
Apr 17 11:17:15.976598 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:15.976568 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt59h"
Apr 17 11:17:22.032571 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.032530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7"
Apr 17 11:17:22.032571 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.032580 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6"
Apr 17 11:17:22.033016 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:22.032675 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:22.033016 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:22.032679 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:22.033016 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:22.032731 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:54.032716215 +0000 UTC m=+97.802591351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found
Apr 17 11:17:22.033016 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:22.032744 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:54.032737607 +0000 UTC m=+97.802612743 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:22.434674 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.434637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:17:22.437568 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.437552 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:17:22.445757 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:22.445740 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:17:22.445819 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:22.445813 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:18:26.445795588 +0000 UTC m=+130.215670725 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : secret "metrics-daemon-secret" not found
Apr 17 11:17:22.535500 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.535457 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:17:22.538296 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.538275 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 11:17:22.548265 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.548241 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 11:17:22.559664 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.559644 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pzs\" (UniqueName: \"kubernetes.io/projected/37df9c48-6708-4b3b-9cca-a6c82f4f253f-kube-api-access-29pzs\") pod \"network-check-target-b97qz\" (UID: \"37df9c48-6708-4b3b-9cca-a6c82f4f253f\") " pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:17:22.692344 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.692258 2580 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dlffp\""
Apr 17 11:17:22.699696 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.699669 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:17:22.815069 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:22.815041 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b97qz"]
Apr 17 11:17:22.818674 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:17:22.818650 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37df9c48_6708_4b3b_9cca_a6c82f4f253f.slice/crio-cc45aa096f43d41763f34d124ad70bc6ccb759918b58a81c78c56b6244f526e7 WatchSource:0}: Error finding container cc45aa096f43d41763f34d124ad70bc6ccb759918b58a81c78c56b6244f526e7: Status 404 returned error can't find the container with id cc45aa096f43d41763f34d124ad70bc6ccb759918b58a81c78c56b6244f526e7
Apr 17 11:17:23.045254 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:23.045169 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b97qz" event={"ID":"37df9c48-6708-4b3b-9cca-a6c82f4f253f","Type":"ContainerStarted","Data":"cc45aa096f43d41763f34d124ad70bc6ccb759918b58a81c78c56b6244f526e7"}
Apr 17 11:17:26.052063 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:26.052025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b97qz" event={"ID":"37df9c48-6708-4b3b-9cca-a6c82f4f253f","Type":"ContainerStarted","Data":"fc5fbadc203a96a3d5d3a7e8ab5192d20d0fd09dbd5a97f6608b149c5dab17ea"}
Apr 17 11:17:26.052462 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:26.052188 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:17:26.068493 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:26.068445 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b97qz" podStartSLOduration=67.405519078 podStartE2EDuration="1m10.068431566s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:17:22.820855874 +0000 UTC m=+66.590731010" lastFinishedPulling="2026-04-17 11:17:25.483768342 +0000 UTC m=+69.253643498" observedRunningTime="2026-04-17 11:17:26.067772666 +0000 UTC m=+69.837647825" watchObservedRunningTime="2026-04-17 11:17:26.068431566 +0000 UTC m=+69.838306767"
Apr 17 11:17:54.050009 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:54.049917 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6"
Apr 17 11:17:54.050009 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:54.050008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7"
Apr 17 11:17:54.050486 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:54.050068 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:54.050486 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:54.050102 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:54.050486 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:54.050167 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert podName:0e6ca43a-9f71-4557-be86-206743aee65b nodeName:}" failed. No retries permitted until 2026-04-17 11:18:58.050128531 +0000 UTC m=+161.820003666 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert") pod "ingress-canary-b4df6" (UID: "0e6ca43a-9f71-4557-be86-206743aee65b") : secret "canary-serving-cert" not found
Apr 17 11:17:54.050486 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:17:54.050182 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls podName:7991569c-ec27-417a-8b37-b1129ca90932 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:58.050175943 +0000 UTC m=+161.820051079 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls") pod "dns-default-mwrb7" (UID: "7991569c-ec27-417a-8b37-b1129ca90932") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:57.056267 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:17:57.056231 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b97qz"
Apr 17 11:18:26.466890 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:26.466826 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:18:26.467425 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:26.466977 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:18:26.467425 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:26.467058 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs podName:71c35dce-5b27-4704-95a2-e390345991dc nodeName:}" failed. No retries permitted until 2026-04-17 11:20:28.467041429 +0000 UTC m=+252.236916564 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs") pod "network-metrics-daemon-s9wws" (UID: "71c35dce-5b27-4704-95a2-e390345991dc") : secret "metrics-daemon-secret" not found
Apr 17 11:18:28.976971 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:28.976940 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"]
Apr 17 11:18:28.979559 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:28.979543 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"
Apr 17 11:18:28.982648 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:28.982628 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-m89dt\""
Apr 17 11:18:28.997150 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:28.997110 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"]
Apr 17 11:18:29.085039 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.084993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5gcm\" (UniqueName: \"kubernetes.io/projected/10a16b0d-5319-46a0-8c7e-2b4e12d48031-kube-api-access-f5gcm\") pod \"network-check-source-8894fc9bd-2tfhp\" (UID: \"10a16b0d-5319-46a0-8c7e-2b4e12d48031\") "
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"
Apr 17 11:18:29.185828 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.185794 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5gcm\" (UniqueName: \"kubernetes.io/projected/10a16b0d-5319-46a0-8c7e-2b4e12d48031-kube-api-access-f5gcm\") pod \"network-check-source-8894fc9bd-2tfhp\" (UID: \"10a16b0d-5319-46a0-8c7e-2b4e12d48031\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"
Apr 17 11:18:29.197671 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.197637 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ch5s"]
Apr 17 11:18:29.200686 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.200662 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8645cf9688-7dk9d"]
Apr 17 11:18:29.200837 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.200820 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:29.204983 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.204949 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5gcm\" (UniqueName: \"kubernetes.io/projected/10a16b0d-5319-46a0-8c7e-2b4e12d48031-kube-api-access-f5gcm\") pod \"network-check-source-8894fc9bd-2tfhp\" (UID: \"10a16b0d-5319-46a0-8c7e-2b4e12d48031\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"
Apr 17 11:18:29.205118 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.205104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 11:18:29.205203 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.205160 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:29.205259 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.205212 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-g9gdf\""
Apr 17 11:18:29.205308 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.205110 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 11:18:29.206495 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.206473 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4qnqt"]
Apr 17 11:18:29.206638 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.206621 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.209213 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.209193 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"]
Apr 17 11:18:29.209371 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.209353 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.212002 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.211981 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"]
Apr 17 11:18:29.212145 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.212114 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"
Apr 17 11:18:29.214549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.214518 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 11:18:29.214549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.214536 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 11:18:29.214713 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.214628 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 11:18:29.214713 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.214665 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:29.214816 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.214774 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 11:18:29.215063 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215047 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 11:18:29.215127 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215055 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 11:18:29.215210 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215150 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 11:18:29.215897 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215876 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-l4t4t\""
Apr 17 11:18:29.216005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215967 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 11:18:29.216005 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215992 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 11:18:29.216126 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.215992 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:29.216732 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.216538 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8pnb8\""
Apr 17 11:18:29.216732 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.216624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vtrcz\""
Apr 17 11:18:29.216732 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.216698 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:29.219012 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.218976 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 11:18:29.219405 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.219386 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4qnqt"]
Apr 17 11:18:29.221407 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.221232 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:29.221407 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.221354 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-9qx76\""
Apr 17 11:18:29.222448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.222429 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 11:18:29.224841 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.224825 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 11:18:29.225173 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.225153 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"]
Apr 17 11:18:29.229059 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.228968 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ch5s"]
Apr 17 11:18:29.230327 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.230307 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"]
Apr 17 11:18:29.231621 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.231591 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 11:18:29.231716 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.231595 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 11:18:29.253560 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.253534 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8645cf9688-7dk9d"]
Apr 17 11:18:29.286395 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286363 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e15de31-bb9b-4066-b6d3-3121da3283ed-service-ca-bundle\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.286540 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-bound-sa-token\") pod \"image-registry-8645cf9688-7dk9d\" (UID:
\"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286540 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-certificates\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286540 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286521 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29717902-39f6-4c43-9cb6-a981d0f5b344-trusted-ca\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:29.286634 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-ca-trust-extracted\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286634 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286579 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:29.286634 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286602 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457a4466-3a04-4448-93fa-458f79dfc2e7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"
Apr 17 11:18:29.286634 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286619 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-installation-pull-secrets\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286636 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e15de31-bb9b-4066-b6d3-3121da3283ed-tmp\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.286773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286672 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/53f4c1d9-1f56-473f-9820-14038f70b6c5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:29.286773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286702 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-trusted-ca\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286729 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-image-registry-private-configuration\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286755 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/457a4466-3a04-4448-93fa-458f79dfc2e7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"
Apr 17 11:18:29.286773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286772 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29717902-39f6-4c43-9cb6-a981d0f5b344-config\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286793 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29717902-39f6-4c43-9cb6-a981d0f5b344-serving-cert\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286827 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgmkv\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-kube-api-access-kgmkv\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286844 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286861 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgj5c\" (UniqueName: \"kubernetes.io/projected/4e15de31-bb9b-4066-b6d3-3121da3283ed-kube-api-access-bgj5c\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286897 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e15de31-bb9b-4066-b6d3-3121da3283ed-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286915 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hphq5\" (UniqueName: \"kubernetes.io/projected/457a4466-3a04-4448-93fa-458f79dfc2e7-kube-api-access-hphq5\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"
Apr 17 11:18:29.286948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286940 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4e15de31-bb9b-4066-b6d3-3121da3283ed-snapshots\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.287201 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.286955 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e15de31-bb9b-4066-b6d3-3121da3283ed-serving-cert\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.287201 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.287019 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-586l6\" (UniqueName: \"kubernetes.io/projected/29717902-39f6-4c43-9cb6-a981d0f5b344-kube-api-access-586l6\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:29.288335 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.288322 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"
Apr 17 11:18:29.387632 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-certificates\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.387803 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29717902-39f6-4c43-9cb6-a981d0f5b344-trusted-ca\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:29.387803 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-ca-trust-extracted\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.387803 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:29.387803 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457a4466-3a04-4448-93fa-458f79dfc2e7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"
Apr 17 11:18:29.387803 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-installation-pull-secrets\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387809 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e15de31-bb9b-4066-b6d3-3121da3283ed-tmp\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt"
Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387836 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/53f4c1d9-1f56-473f-9820-14038f70b6c5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387874 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName:
\"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-trusted-ca\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387907 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-image-registry-private-configuration\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/457a4466-3a04-4448-93fa-458f79dfc2e7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.387963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29717902-39f6-4c43-9cb6-a981d0f5b344-config\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29717902-39f6-4c43-9cb6-a981d0f5b344-serving-cert\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.388050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgmkv\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-kube-api-access-kgmkv\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388065 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388097 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgj5c\" (UniqueName: \"kubernetes.io/projected/4e15de31-bb9b-4066-b6d3-3121da3283ed-kube-api-access-bgj5c\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e15de31-bb9b-4066-b6d3-3121da3283ed-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388177 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hphq5\" (UniqueName: \"kubernetes.io/projected/457a4466-3a04-4448-93fa-458f79dfc2e7-kube-api-access-hphq5\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388202 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4e15de31-bb9b-4066-b6d3-3121da3283ed-snapshots\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e15de31-bb9b-4066-b6d3-3121da3283ed-serving-cert\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388278 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-586l6\" (UniqueName: \"kubernetes.io/projected/29717902-39f6-4c43-9cb6-a981d0f5b344-kube-api-access-586l6\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388314 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e15de31-bb9b-4066-b6d3-3121da3283ed-service-ca-bundle\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: 
\"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388349 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-bound-sa-token\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388385 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-certificates\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.388452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/457a4466-3a04-4448-93fa-458f79dfc2e7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" Apr 17 11:18:29.388938 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.388689 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-ca-trust-extracted\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.389108 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e15de31-bb9b-4066-b6d3-3121da3283ed-tmp\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.389182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-trusted-ca\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.389214 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/53f4c1d9-1f56-473f-9820-14038f70b6c5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp" Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.389727 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29717902-39f6-4c43-9cb6-a981d0f5b344-trusted-ca\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.389824 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.389900 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert podName:53f4c1d9-1f56-473f-9820-14038f70b6c5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:29.889881306 +0000 UTC m=+133.659756449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cnlnp" (UID: "53f4c1d9-1f56-473f-9820-14038f70b6c5") : secret "networking-console-plugin-cert" not found Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.389824 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4e15de31-bb9b-4066-b6d3-3121da3283ed-snapshots\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.390106 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.390120 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8645cf9688-7dk9d: secret "image-registry-tls" not found Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.390187 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls podName:73ee7579-8c67-47f1-84bd-e0c0f43b0e54 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:29.890170306 +0000 UTC m=+133.660045459 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls") pod "image-registry-8645cf9688-7dk9d" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54") : secret "image-registry-tls" not found Apr 17 11:18:29.390360 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.390318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29717902-39f6-4c43-9cb6-a981d0f5b344-config\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.391358 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.390957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e15de31-bb9b-4066-b6d3-3121da3283ed-service-ca-bundle\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.391419 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.391354 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e15de31-bb9b-4066-b6d3-3121da3283ed-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.392272 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.392234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-installation-pull-secrets\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.392272 
ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.392266 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/457a4466-3a04-4448-93fa-458f79dfc2e7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" Apr 17 11:18:29.392454 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.392385 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29717902-39f6-4c43-9cb6-a981d0f5b344-serving-cert\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.392492 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.392455 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-image-registry-private-configuration\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.392540 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.392518 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e15de31-bb9b-4066-b6d3-3121da3283ed-serving-cert\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.397946 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.397923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-586l6\" (UniqueName: 
\"kubernetes.io/projected/29717902-39f6-4c43-9cb6-a981d0f5b344-kube-api-access-586l6\") pod \"console-operator-9d4b6777b-8ch5s\" (UID: \"29717902-39f6-4c43-9cb6-a981d0f5b344\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.398311 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.398287 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-bound-sa-token\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.398950 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.398925 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hphq5\" (UniqueName: \"kubernetes.io/projected/457a4466-3a04-4448-93fa-458f79dfc2e7-kube-api-access-hphq5\") pod \"kube-storage-version-migrator-operator-6769c5d45-zd4c6\" (UID: \"457a4466-3a04-4448-93fa-458f79dfc2e7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" Apr 17 11:18:29.399225 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.399127 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgj5c\" (UniqueName: \"kubernetes.io/projected/4e15de31-bb9b-4066-b6d3-3121da3283ed-kube-api-access-bgj5c\") pod \"insights-operator-585dfdc468-4qnqt\" (UID: \"4e15de31-bb9b-4066-b6d3-3121da3283ed\") " pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.399779 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.399752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgmkv\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-kube-api-access-kgmkv\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " 
pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.415803 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.415776 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp"] Apr 17 11:18:29.419163 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:29.419128 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a16b0d_5319_46a0_8c7e_2b4e12d48031.slice/crio-f23b0b19e316e9a3f43a7770ce62b5e9a7dc3bf3e003c5220efc06d360c3102d WatchSource:0}: Error finding container f23b0b19e316e9a3f43a7770ce62b5e9a7dc3bf3e003c5220efc06d360c3102d: Status 404 returned error can't find the container with id f23b0b19e316e9a3f43a7770ce62b5e9a7dc3bf3e003c5220efc06d360c3102d Apr 17 11:18:29.517439 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.517348 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" Apr 17 11:18:29.533400 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.533369 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4qnqt" Apr 17 11:18:29.542197 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.542170 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" Apr 17 11:18:29.667635 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.667600 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ch5s"] Apr 17 11:18:29.672412 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:29.672376 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29717902_39f6_4c43_9cb6_a981d0f5b344.slice/crio-934a70590723b42120c69735334a5a8fcbbbdb4b611f4e106605bbdeb450dde1 WatchSource:0}: Error finding container 934a70590723b42120c69735334a5a8fcbbbdb4b611f4e106605bbdeb450dde1: Status 404 returned error can't find the container with id 934a70590723b42120c69735334a5a8fcbbbdb4b611f4e106605bbdeb450dde1 Apr 17 11:18:29.688474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.688438 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4qnqt"] Apr 17 11:18:29.692102 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:29.692064 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e15de31_bb9b_4066_b6d3_3121da3283ed.slice/crio-0a7273137cace11c43f06b12dd87bdfa6857df8b6eef6b27c67bf53545da463a WatchSource:0}: Error finding container 0a7273137cace11c43f06b12dd87bdfa6857df8b6eef6b27c67bf53545da463a: Status 404 returned error can't find the container with id 0a7273137cace11c43f06b12dd87bdfa6857df8b6eef6b27c67bf53545da463a Apr 17 11:18:29.700376 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.700351 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6"] Apr 17 11:18:29.703829 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:29.703802 2580 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod457a4466_3a04_4448_93fa_458f79dfc2e7.slice/crio-958ffe9100a5f7cd5ab8551329c5780b42b69b81b4bf4be71927a69c492b34bd WatchSource:0}: Error finding container 958ffe9100a5f7cd5ab8551329c5780b42b69b81b4bf4be71927a69c492b34bd: Status 404 returned error can't find the container with id 958ffe9100a5f7cd5ab8551329c5780b42b69b81b4bf4be71927a69c492b34bd Apr 17 11:18:29.893761 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.893653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp" Apr 17 11:18:29.893761 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:29.893717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:18:29.894003 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.893819 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:18:29.894003 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.893899 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert podName:53f4c1d9-1f56-473f-9820-14038f70b6c5 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:30.893878364 +0000 UTC m=+134.663753517 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cnlnp" (UID: "53f4c1d9-1f56-473f-9820-14038f70b6c5") : secret "networking-console-plugin-cert" not found Apr 17 11:18:29.894003 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.893822 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:29.894003 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.893927 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8645cf9688-7dk9d: secret "image-registry-tls" not found Apr 17 11:18:29.894003 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:29.893971 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls podName:73ee7579-8c67-47f1-84bd-e0c0f43b0e54 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:30.89395857 +0000 UTC m=+134.663833706 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls") pod "image-registry-8645cf9688-7dk9d" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54") : secret "image-registry-tls" not found
Apr 17 11:18:30.171334 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.171241 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" event={"ID":"457a4466-3a04-4448-93fa-458f79dfc2e7","Type":"ContainerStarted","Data":"958ffe9100a5f7cd5ab8551329c5780b42b69b81b4bf4be71927a69c492b34bd"}
Apr 17 11:18:30.172469 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.172429 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" event={"ID":"29717902-39f6-4c43-9cb6-a981d0f5b344","Type":"ContainerStarted","Data":"934a70590723b42120c69735334a5a8fcbbbdb4b611f4e106605bbdeb450dde1"}
Apr 17 11:18:30.173870 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.173837 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4qnqt" event={"ID":"4e15de31-bb9b-4066-b6d3-3121da3283ed","Type":"ContainerStarted","Data":"0a7273137cace11c43f06b12dd87bdfa6857df8b6eef6b27c67bf53545da463a"}
Apr 17 11:18:30.175322 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.175294 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp" event={"ID":"10a16b0d-5319-46a0-8c7e-2b4e12d48031","Type":"ContainerStarted","Data":"2e31464d168d46eef6e4836a7aba1e799b59b54281db88c26dfcb02df250ff1b"}
Apr 17 11:18:30.175322 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.175330 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp" event={"ID":"10a16b0d-5319-46a0-8c7e-2b4e12d48031","Type":"ContainerStarted","Data":"f23b0b19e316e9a3f43a7770ce62b5e9a7dc3bf3e003c5220efc06d360c3102d"}
Apr 17 11:18:30.193543 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.193416 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2tfhp" podStartSLOduration=2.193382493 podStartE2EDuration="2.193382493s" podCreationTimestamp="2026-04-17 11:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:30.191962868 +0000 UTC m=+133.961838027" watchObservedRunningTime="2026-04-17 11:18:30.193382493 +0000 UTC m=+133.963257655"
Apr 17 11:18:30.903963 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.903915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:30.904194 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:30.904007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:30.904194 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:30.904095 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:30.904194 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:30.904183 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:30.904371 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:30.904203 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert podName:53f4c1d9-1f56-473f-9820-14038f70b6c5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:32.90417866 +0000 UTC m=+136.674053796 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cnlnp" (UID: "53f4c1d9-1f56-473f-9820-14038f70b6c5") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:30.904371 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:30.904205 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8645cf9688-7dk9d: secret "image-registry-tls" not found
Apr 17 11:18:30.904371 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:30.904263 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls podName:73ee7579-8c67-47f1-84bd-e0c0f43b0e54 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:32.904245958 +0000 UTC m=+136.674121108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls") pod "image-registry-8645cf9688-7dk9d" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54") : secret "image-registry-tls" not found
Apr 17 11:18:32.924462 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:32.924356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:32.924955 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:32.924470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:32.924955 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:32.924565 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:32.924955 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:32.924591 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8645cf9688-7dk9d: secret "image-registry-tls" not found
Apr 17 11:18:32.924955 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:32.924593 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:32.924955 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:32.924670 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls podName:73ee7579-8c67-47f1-84bd-e0c0f43b0e54 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:36.92464823 +0000 UTC m=+140.694523372 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls") pod "image-registry-8645cf9688-7dk9d" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54") : secret "image-registry-tls" not found
Apr 17 11:18:32.924955 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:32.924691 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert podName:53f4c1d9-1f56-473f-9820-14038f70b6c5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:36.924682197 +0000 UTC m=+140.694557338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cnlnp" (UID: "53f4c1d9-1f56-473f-9820-14038f70b6c5") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:33.184324 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.184274 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" event={"ID":"457a4466-3a04-4448-93fa-458f79dfc2e7","Type":"ContainerStarted","Data":"c5b4dbf1c8ab77ea46b6627e1b4b1454825e582bffefcb1219c327936055139f"}
Apr 17 11:18:33.185892 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.185870 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/0.log"
Apr 17 11:18:33.186012 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.185912 2580 generic.go:358] "Generic (PLEG): container finished" podID="29717902-39f6-4c43-9cb6-a981d0f5b344" containerID="a7228c39aec7e60c02cb60c9dc5f59dd2ee0714e6b55ee1867d0cd2ddfa1d4a5" exitCode=255
Apr 17 11:18:33.186080 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.185998 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" event={"ID":"29717902-39f6-4c43-9cb6-a981d0f5b344","Type":"ContainerDied","Data":"a7228c39aec7e60c02cb60c9dc5f59dd2ee0714e6b55ee1867d0cd2ddfa1d4a5"}
Apr 17 11:18:33.186260 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.186238 2580 scope.go:117] "RemoveContainer" containerID="a7228c39aec7e60c02cb60c9dc5f59dd2ee0714e6b55ee1867d0cd2ddfa1d4a5"
Apr 17 11:18:33.187431 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.187410 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4qnqt" event={"ID":"4e15de31-bb9b-4066-b6d3-3121da3283ed","Type":"ContainerStarted","Data":"821edf6f47b345710157c15e61c78cb98206471e730119ad8038d740a0d5be04"}
Apr 17 11:18:33.204934 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.204890 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" podStartSLOduration=1.323605123 podStartE2EDuration="4.204874837s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:18:29.705618476 +0000 UTC m=+133.475493611" lastFinishedPulling="2026-04-17 11:18:32.586888175 +0000 UTC m=+136.356763325" observedRunningTime="2026-04-17 11:18:33.20408264 +0000 UTC m=+136.973957793" watchObservedRunningTime="2026-04-17 11:18:33.204874837 +0000 UTC m=+136.974750008"
Apr 17 11:18:33.248592 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:33.248526 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-4qnqt" podStartSLOduration=1.352895369 podStartE2EDuration="4.248507441s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:18:29.693854942 +0000 UTC m=+133.463730079" lastFinishedPulling="2026-04-17 11:18:32.589467013 +0000 UTC m=+136.359342151" observedRunningTime="2026-04-17 11:18:33.247376211 +0000 UTC m=+137.017251369" watchObservedRunningTime="2026-04-17 11:18:33.248507441 +0000 UTC m=+137.018382602"
Apr 17 11:18:34.191342 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.191313 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log"
Apr 17 11:18:34.191759 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.191717 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/0.log"
Apr 17 11:18:34.191807 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.191755 2580 generic.go:358] "Generic (PLEG): container finished" podID="29717902-39f6-4c43-9cb6-a981d0f5b344" containerID="942da1e4aff38156d7e45928dea0a84465d6eb7dfd5daa3f54aeea0470726f12" exitCode=255
Apr 17 11:18:34.191872 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.191850 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" event={"ID":"29717902-39f6-4c43-9cb6-a981d0f5b344","Type":"ContainerDied","Data":"942da1e4aff38156d7e45928dea0a84465d6eb7dfd5daa3f54aeea0470726f12"}
Apr 17 11:18:34.191903 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.191896 2580 scope.go:117] "RemoveContainer" containerID="a7228c39aec7e60c02cb60c9dc5f59dd2ee0714e6b55ee1867d0cd2ddfa1d4a5"
Apr 17 11:18:34.192300 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.192284 2580 scope.go:117] "RemoveContainer" containerID="942da1e4aff38156d7e45928dea0a84465d6eb7dfd5daa3f54aeea0470726f12"
Apr 17 11:18:34.192503 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:34.192478 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ch5s_openshift-console-operator(29717902-39f6-4c43-9cb6-a981d0f5b344)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" podUID="29717902-39f6-4c43-9cb6-a981d0f5b344"
Apr 17 11:18:34.733096 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.733050 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nglkq"]
Apr 17 11:18:34.737345 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.737320 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.741096 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.741075 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 11:18:34.743229 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.743199 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 11:18:34.743632 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.743431 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 11:18:34.743632 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.743473 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bng55\""
Apr 17 11:18:34.744405 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.744384 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 11:18:34.750844 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.750816 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nglkq"]
Apr 17 11:18:34.842062 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.842020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/95cb9fb5-0263-427d-b7b1-9da06103da13-signing-cabundle\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.842264 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.842098 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/95cb9fb5-0263-427d-b7b1-9da06103da13-signing-key\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.842264 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.842154 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df8z4\" (UniqueName: \"kubernetes.io/projected/95cb9fb5-0263-427d-b7b1-9da06103da13-kube-api-access-df8z4\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.942742 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.942706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/95cb9fb5-0263-427d-b7b1-9da06103da13-signing-cabundle\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.942860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.942775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/95cb9fb5-0263-427d-b7b1-9da06103da13-signing-key\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.942860 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.942804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df8z4\" (UniqueName: \"kubernetes.io/projected/95cb9fb5-0263-427d-b7b1-9da06103da13-kube-api-access-df8z4\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.943400 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.943372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/95cb9fb5-0263-427d-b7b1-9da06103da13-signing-cabundle\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.945346 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.945322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/95cb9fb5-0263-427d-b7b1-9da06103da13-signing-key\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:34.955924 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:34.955900 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-df8z4\" (UniqueName: \"kubernetes.io/projected/95cb9fb5-0263-427d-b7b1-9da06103da13-kube-api-access-df8z4\") pod \"service-ca-865cb79987-nglkq\" (UID: \"95cb9fb5-0263-427d-b7b1-9da06103da13\") " pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:35.046651 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:35.046557 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nglkq"
Apr 17 11:18:35.162148 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:35.162100 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nglkq"]
Apr 17 11:18:35.165571 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:35.165542 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95cb9fb5_0263_427d_b7b1_9da06103da13.slice/crio-a5dde7494bbc586db8fe9e2b2db09a5a5014c3992487982c6e37343be257eca1 WatchSource:0}: Error finding container a5dde7494bbc586db8fe9e2b2db09a5a5014c3992487982c6e37343be257eca1: Status 404 returned error can't find the container with id a5dde7494bbc586db8fe9e2b2db09a5a5014c3992487982c6e37343be257eca1
Apr 17 11:18:35.195993 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:35.195971 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log"
Apr 17 11:18:35.196408 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:35.196390 2580 scope.go:117] "RemoveContainer" containerID="942da1e4aff38156d7e45928dea0a84465d6eb7dfd5daa3f54aeea0470726f12"
Apr 17 11:18:35.196632 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:35.196607 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ch5s_openshift-console-operator(29717902-39f6-4c43-9cb6-a981d0f5b344)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" podUID="29717902-39f6-4c43-9cb6-a981d0f5b344"
Apr 17 11:18:35.197097 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:35.197076 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nglkq" event={"ID":"95cb9fb5-0263-427d-b7b1-9da06103da13","Type":"ContainerStarted","Data":"a5dde7494bbc586db8fe9e2b2db09a5a5014c3992487982c6e37343be257eca1"}
Apr 17 11:18:35.788429 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:35.788401 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kgg54_6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb/dns-node-resolver/0.log"
Apr 17 11:18:36.588151 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:36.588106 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bfkcr_2604f0e7-0ee7-4d02-adf7-f046ecf35e36/node-ca/0.log"
Apr 17 11:18:36.960987 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:36.960939 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:36.961217 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:36.961032 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:36.961217 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:36.961083 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:36.961217 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:36.961176 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert podName:53f4c1d9-1f56-473f-9820-14038f70b6c5 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.96115572 +0000 UTC m=+148.731030870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cnlnp" (UID: "53f4c1d9-1f56-473f-9820-14038f70b6c5") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:36.961393 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:36.961223 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:36.961393 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:36.961246 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8645cf9688-7dk9d: secret "image-registry-tls" not found
Apr 17 11:18:36.961393 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:36.961300 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls podName:73ee7579-8c67-47f1-84bd-e0c0f43b0e54 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.961284423 +0000 UTC m=+148.731159572 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls") pod "image-registry-8645cf9688-7dk9d" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54") : secret "image-registry-tls" not found
Apr 17 11:18:37.204122 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:37.204082 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nglkq" event={"ID":"95cb9fb5-0263-427d-b7b1-9da06103da13","Type":"ContainerStarted","Data":"8cdfba32b11f63306ab262f570eb5477365d5bc79ccf7fa2cb3e04d73f36b983"}
Apr 17 11:18:37.226185 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:37.226104 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-nglkq" podStartSLOduration=1.582918628 podStartE2EDuration="3.226089911s" podCreationTimestamp="2026-04-17 11:18:34 +0000 UTC" firstStartedPulling="2026-04-17 11:18:35.167478169 +0000 UTC m=+138.937353305" lastFinishedPulling="2026-04-17 11:18:36.810649452 +0000 UTC m=+140.580524588" observedRunningTime="2026-04-17 11:18:37.225977223 +0000 UTC m=+140.995852383" watchObservedRunningTime="2026-04-17 11:18:37.226089911 +0000 UTC m=+140.995965069"
Apr 17 11:18:39.518223 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:39.518182 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:39.518223 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:39.518223 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:39.518775 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:39.518584 2580 scope.go:117] "RemoveContainer" containerID="942da1e4aff38156d7e45928dea0a84465d6eb7dfd5daa3f54aeea0470726f12"
Apr 17 11:18:39.518775 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:39.518756 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ch5s_openshift-console-operator(29717902-39f6-4c43-9cb6-a981d0f5b344)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" podUID="29717902-39f6-4c43-9cb6-a981d0f5b344"
Apr 17 11:18:45.034689 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:45.034642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:45.035105 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:45.034750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:18:45.035105 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:45.034849 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:45.035105 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:45.035015 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert podName:53f4c1d9-1f56-473f-9820-14038f70b6c5 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:01.034893408 +0000 UTC m=+164.804768546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cnlnp" (UID: "53f4c1d9-1f56-473f-9820-14038f70b6c5") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:45.037452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:45.037426 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"image-registry-8645cf9688-7dk9d\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:45.126010 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:45.125961 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:45.273955 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:45.270354 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8645cf9688-7dk9d"]
Apr 17 11:18:45.275530 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:45.275499 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ee7579_8c67_47f1_84bd_e0c0f43b0e54.slice/crio-f2ccd660e1f083e97b6b398e2747aae306941861a4bb4a38787dc5ccb7cb1d0e WatchSource:0}: Error finding container f2ccd660e1f083e97b6b398e2747aae306941861a4bb4a38787dc5ccb7cb1d0e: Status 404 returned error can't find the container with id f2ccd660e1f083e97b6b398e2747aae306941861a4bb4a38787dc5ccb7cb1d0e
Apr 17 11:18:46.233448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:46.233404 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" event={"ID":"73ee7579-8c67-47f1-84bd-e0c0f43b0e54","Type":"ContainerStarted","Data":"7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b"}
Apr 17 11:18:46.233448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:46.233447 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" event={"ID":"73ee7579-8c67-47f1-84bd-e0c0f43b0e54","Type":"ContainerStarted","Data":"f2ccd660e1f083e97b6b398e2747aae306941861a4bb4a38787dc5ccb7cb1d0e"}
Apr 17 11:18:46.234012 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:46.233547 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:18:46.254839 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:46.254783 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" podStartSLOduration=17.254767549 podStartE2EDuration="17.254767549s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:46.253381991 +0000 UTC m=+150.023257149" watchObservedRunningTime="2026-04-17 11:18:46.254767549 +0000 UTC m=+150.024642706"
Apr 17 11:18:53.217605 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:53.217537 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mwrb7" podUID="7991569c-ec27-417a-8b37-b1129ca90932"
Apr 17 11:18:53.228759 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:53.228718 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b4df6" podUID="0e6ca43a-9f71-4557-be86-206743aee65b"
Apr 17 11:18:53.250857 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:53.250829 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mwrb7"
Apr 17 11:18:54.779337 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:54.779303 2580 scope.go:117] "RemoveContainer" containerID="942da1e4aff38156d7e45928dea0a84465d6eb7dfd5daa3f54aeea0470726f12"
Apr 17 11:18:54.794259 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:18:54.794225 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-s9wws" podUID="71c35dce-5b27-4704-95a2-e390345991dc"
Apr 17 11:18:55.258494 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.258467 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log"
Apr 17 11:18:55.258671 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.258539 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" event={"ID":"29717902-39f6-4c43-9cb6-a981d0f5b344","Type":"ContainerStarted","Data":"2819216e6278c39664d270fc1826089c7ada492f31b883c74a3ee82e7bc35966"}
Apr 17 11:18:55.258827 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.258799 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:55.277307 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.277232 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s" podStartSLOduration=23.368365604 podStartE2EDuration="26.277218326s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:18:29.674615441 +0000 UTC m=+133.444490577" lastFinishedPulling="2026-04-17 11:18:32.583468163 +0000 UTC m=+136.353343299" observedRunningTime="2026-04-17 11:18:55.275821439 +0000 UTC m=+159.045696608" watchObservedRunningTime="2026-04-17 11:18:55.277218326 +0000 UTC m=+159.047093484"
Apr 17 11:18:55.811449 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.811422 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ch5s"
Apr 17 11:18:55.910805 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.910760 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8645cf9688-7dk9d"]
Apr 17 11:18:55.967776 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.967736 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d4d9cc967-whvw8"]
Apr 17 11:18:55.970590 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.970561 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d4d9cc967-whvw8"
Apr 17 11:18:55.977500 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.977471 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-srvwb"]
Apr 17 11:18:55.979813 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.979791 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-srvwb"
Apr 17 11:18:55.984519 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.984499 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:18:55.986406 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.986385 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rkd8r\""
Apr 17 11:18:55.986566 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.986390 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:18:55.994378 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:55.994356 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d4d9cc967-whvw8"]
Apr 17 11:18:56.018254 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6lcl\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-kube-api-access-h6lcl\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8"
Apr 17 11:18:56.018415 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018306 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-registry-tls\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8"
Apr 17 11:18:56.018415 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f73caf-8c19-40b8-a6b8-f066ed884db8-installation-pull-secrets\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8"
Apr 17 11:18:56.018506 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018425 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f73caf-8c19-40b8-a6b8-f066ed884db8-registry-certificates\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8"
Apr 17 11:18:56.018506 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f73caf-8c19-40b8-a6b8-f066ed884db8-trusted-ca\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8"
Apr 17 11:18:56.018506 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018486 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-data-volume\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb"
Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-kube-rbac-proxy-cm\") pod
\"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018568 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f73caf-8c19-40b8-a6b8-f066ed884db8-image-registry-private-configuration\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018606 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f73caf-8c19-40b8-a6b8-f066ed884db8-ca-trust-extracted\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018634 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-bound-sa-token\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-crio-socket\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.019829 
ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018709 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt9ks\" (UniqueName: \"kubernetes.io/projected/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-kube-api-access-dt9ks\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.018800 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.019829 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.019768 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-srvwb"] Apr 17 11:18:56.120026 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.119915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.120026 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.119973 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6lcl\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-kube-api-access-h6lcl\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" 
Apr 17 11:18:56.120026 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120015 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-registry-tls\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120026 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120033 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d2f73caf-8c19-40b8-a6b8-f066ed884db8-installation-pull-secrets\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f73caf-8c19-40b8-a6b8-f066ed884db8-registry-certificates\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f73caf-8c19-40b8-a6b8-f066ed884db8-trusted-ca\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120121 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-data-volume\") 
pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120177 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120208 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f73caf-8c19-40b8-a6b8-f066ed884db8-image-registry-private-configuration\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120244 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f73caf-8c19-40b8-a6b8-f066ed884db8-ca-trust-extracted\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-bound-sa-token\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120309 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-crio-socket\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.120411 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.120335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dt9ks\" (UniqueName: \"kubernetes.io/projected/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-kube-api-access-dt9ks\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.121392 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.121087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d2f73caf-8c19-40b8-a6b8-f066ed884db8-ca-trust-extracted\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.121392 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.121197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-crio-socket\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.121392 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.121234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d2f73caf-8c19-40b8-a6b8-f066ed884db8-registry-certificates\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " 
pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.121392 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.121364 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2f73caf-8c19-40b8-a6b8-f066ed884db8-trusted-ca\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.121696 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.121491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-data-volume\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.121756 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.121744 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.122870 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.122843 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.122981 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.122885 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/d2f73caf-8c19-40b8-a6b8-f066ed884db8-installation-pull-secrets\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.122981 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.122904 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-registry-tls\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.123080 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.122981 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d2f73caf-8c19-40b8-a6b8-f066ed884db8-image-registry-private-configuration\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.131893 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.131868 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6lcl\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-kube-api-access-h6lcl\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.136417 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.136383 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt9ks\" (UniqueName: \"kubernetes.io/projected/7c2bbce7-232a-4fe0-bb21-1fc7feafdfff-kube-api-access-dt9ks\") pod \"insights-runtime-extractor-srvwb\" (UID: \"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff\") " pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 
11:18:56.136532 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.136476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2f73caf-8c19-40b8-a6b8-f066ed884db8-bound-sa-token\") pod \"image-registry-d4d9cc967-whvw8\" (UID: \"d2f73caf-8c19-40b8-a6b8-f066ed884db8\") " pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.282242 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.282203 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:56.290056 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.290029 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-srvwb" Apr 17 11:18:56.451126 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.451068 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-srvwb"] Apr 17 11:18:56.452813 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:56.452789 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d4d9cc967-whvw8"] Apr 17 11:18:56.455368 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:56.455342 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c2bbce7_232a_4fe0_bb21_1fc7feafdfff.slice/crio-b44a43f529e4b1d15f558c2300ca8c50681abb5756c77d440e95674c53786464 WatchSource:0}: Error finding container b44a43f529e4b1d15f558c2300ca8c50681abb5756c77d440e95674c53786464: Status 404 returned error can't find the container with id b44a43f529e4b1d15f558c2300ca8c50681abb5756c77d440e95674c53786464 Apr 17 11:18:56.455892 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:56.455867 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f73caf_8c19_40b8_a6b8_f066ed884db8.slice/crio-e8fe4b9dc746abb34cf3fb7d29228b82b01ca1c076acbe3260fe9cb33ca1c7c4 WatchSource:0}: Error finding container e8fe4b9dc746abb34cf3fb7d29228b82b01ca1c076acbe3260fe9cb33ca1c7c4: Status 404 returned error can't find the container with id e8fe4b9dc746abb34cf3fb7d29228b82b01ca1c076acbe3260fe9cb33ca1c7c4 Apr 17 11:18:57.270192 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:57.270154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srvwb" event={"ID":"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff","Type":"ContainerStarted","Data":"bafba338ae09c133f9d0c68d0b44c6b46eda6a9ccc44fb1e7a0bf36603952695"} Apr 17 11:18:57.270582 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:57.270199 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srvwb" event={"ID":"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff","Type":"ContainerStarted","Data":"a83af480d1341a749f819ef4ab4723cae395b79f90b14e5fe588b97a7b530c1a"} Apr 17 11:18:57.270582 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:57.270209 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srvwb" event={"ID":"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff","Type":"ContainerStarted","Data":"b44a43f529e4b1d15f558c2300ca8c50681abb5756c77d440e95674c53786464"} Apr 17 11:18:57.271562 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:57.271535 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" event={"ID":"d2f73caf-8c19-40b8-a6b8-f066ed884db8","Type":"ContainerStarted","Data":"8758c751fc8ac1b426f80cb3e6e9dc8c49424a661e7b4dd82ca2dd84efc18771"} Apr 17 11:18:57.271659 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:57.271571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" event={"ID":"d2f73caf-8c19-40b8-a6b8-f066ed884db8","Type":"ContainerStarted","Data":"e8fe4b9dc746abb34cf3fb7d29228b82b01ca1c076acbe3260fe9cb33ca1c7c4"} Apr 17 11:18:57.296227 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:57.296112 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" podStartSLOduration=2.29609202 podStartE2EDuration="2.29609202s" podCreationTimestamp="2026-04-17 11:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:57.295331869 +0000 UTC m=+161.065207027" watchObservedRunningTime="2026-04-17 11:18:57.29609202 +0000 UTC m=+161.065967180" Apr 17 11:18:58.000086 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.000007 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" podUID="cb8b0150-d78f-47c4-8c14-3f44a773fc28" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused" Apr 17 11:18:58.136795 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.136752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:18:58.136982 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.136819 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" 
Apr 17 11:18:58.139901 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.139870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7991569c-ec27-417a-8b37-b1129ca90932-metrics-tls\") pod \"dns-default-mwrb7\" (UID: \"7991569c-ec27-417a-8b37-b1129ca90932\") " pod="openshift-dns/dns-default-mwrb7" Apr 17 11:18:58.140018 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.139966 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e6ca43a-9f71-4557-be86-206743aee65b-cert\") pod \"ingress-canary-b4df6\" (UID: \"0e6ca43a-9f71-4557-be86-206743aee65b\") " pod="openshift-ingress-canary/ingress-canary-b4df6" Apr 17 11:18:58.276106 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.276010 2580 generic.go:358] "Generic (PLEG): container finished" podID="cb8b0150-d78f-47c4-8c14-3f44a773fc28" containerID="dd039347b7a44e48aff58e42b0dbecc44e1b20c141e0f5224107aa9c81b75dcd" exitCode=1 Apr 17 11:18:58.276556 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.276091 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" event={"ID":"cb8b0150-d78f-47c4-8c14-3f44a773fc28","Type":"ContainerDied","Data":"dd039347b7a44e48aff58e42b0dbecc44e1b20c141e0f5224107aa9c81b75dcd"} Apr 17 11:18:58.276556 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.276320 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:18:58.276670 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.276606 2580 scope.go:117] "RemoveContainer" containerID="dd039347b7a44e48aff58e42b0dbecc44e1b20c141e0f5224107aa9c81b75dcd" Apr 17 11:18:58.354611 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.354575 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvwdc\"" Apr 17 11:18:58.362817 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.362785 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mwrb7" Apr 17 11:18:58.630725 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:58.630693 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mwrb7"] Apr 17 11:18:58.633349 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:18:58.633320 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7991569c_ec27_417a_8b37_b1129ca90932.slice/crio-5de2907b08d351dcd31a83e77709b8e88a6e0bc0e1859691d7fc89076171e300 WatchSource:0}: Error finding container 5de2907b08d351dcd31a83e77709b8e88a6e0bc0e1859691d7fc89076171e300: Status 404 returned error can't find the container with id 5de2907b08d351dcd31a83e77709b8e88a6e0bc0e1859691d7fc89076171e300 Apr 17 11:18:59.280690 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:59.280605 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mwrb7" event={"ID":"7991569c-ec27-417a-8b37-b1129ca90932","Type":"ContainerStarted","Data":"5de2907b08d351dcd31a83e77709b8e88a6e0bc0e1859691d7fc89076171e300"} Apr 17 11:18:59.282846 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:59.282816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srvwb" event={"ID":"7c2bbce7-232a-4fe0-bb21-1fc7feafdfff","Type":"ContainerStarted","Data":"e02896add980b12487141647fed9a12a90296b55aa3aae95893796b6213874a5"} Apr 17 11:18:59.284811 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:59.284780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" 
event={"ID":"cb8b0150-d78f-47c4-8c14-3f44a773fc28","Type":"ContainerStarted","Data":"4d8a6eb788683cbae67ab210e96dfe9546a706c27bafcccb3dd021793389dfe2"} Apr 17 11:18:59.285337 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:59.285321 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:18:59.285925 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:59.285904 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f6d6797d5-4dlr7" Apr 17 11:18:59.304401 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:18:59.304341 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-srvwb" podStartSLOduration=2.291437393 podStartE2EDuration="4.304326503s" podCreationTimestamp="2026-04-17 11:18:55 +0000 UTC" firstStartedPulling="2026-04-17 11:18:56.515883507 +0000 UTC m=+160.285758644" lastFinishedPulling="2026-04-17 11:18:58.528772609 +0000 UTC m=+162.298647754" observedRunningTime="2026-04-17 11:18:59.302122436 +0000 UTC m=+163.071997607" watchObservedRunningTime="2026-04-17 11:18:59.304326503 +0000 UTC m=+163.074201660" Apr 17 11:19:00.288976 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:00.288931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mwrb7" event={"ID":"7991569c-ec27-417a-8b37-b1129ca90932","Type":"ContainerStarted","Data":"49761f9447b8c8ddc1355a1e23e1449b14918591024aef843f4c72f962c6314d"} Apr 17 11:19:00.289462 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:00.288984 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mwrb7" event={"ID":"7991569c-ec27-417a-8b37-b1129ca90932","Type":"ContainerStarted","Data":"730c1826a763289868ff98c1954fbbdca24477545809a8036ec5fe92caceff0e"} Apr 17 11:19:00.319397 ip-10-0-135-188 
kubenswrapper[2580]: I0417 11:19:00.319345 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mwrb7" podStartSLOduration=129.093071223 podStartE2EDuration="2m10.319328442s" podCreationTimestamp="2026-04-17 11:16:50 +0000 UTC" firstStartedPulling="2026-04-17 11:18:58.635119713 +0000 UTC m=+162.404994848" lastFinishedPulling="2026-04-17 11:18:59.861376931 +0000 UTC m=+163.631252067" observedRunningTime="2026-04-17 11:19:00.317319147 +0000 UTC m=+164.087194304" watchObservedRunningTime="2026-04-17 11:19:00.319328442 +0000 UTC m=+164.089203610"
Apr 17 11:19:01.061861 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:01.061817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:19:01.064597 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:01.064569 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/53f4c1d9-1f56-473f-9820-14038f70b6c5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cnlnp\" (UID: \"53f4c1d9-1f56-473f-9820-14038f70b6c5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:19:01.293624 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:01.293584 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mwrb7"
Apr 17 11:19:01.345933 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:01.345841 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"
Apr 17 11:19:01.466392 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:01.466356 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp"]
Apr 17 11:19:01.469309 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:01.469282 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53f4c1d9_1f56_473f_9820_14038f70b6c5.slice/crio-67665c0a93d55239a2ba1c438b004441716dce321ce0db7a6530f628b4f91ac3 WatchSource:0}: Error finding container 67665c0a93d55239a2ba1c438b004441716dce321ce0db7a6530f628b4f91ac3: Status 404 returned error can't find the container with id 67665c0a93d55239a2ba1c438b004441716dce321ce0db7a6530f628b4f91ac3
Apr 17 11:19:02.298032 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:02.297992 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp" event={"ID":"53f4c1d9-1f56-473f-9820-14038f70b6c5","Type":"ContainerStarted","Data":"67665c0a93d55239a2ba1c438b004441716dce321ce0db7a6530f628b4f91ac3"}
Apr 17 11:19:03.302178 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:03.302144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp" event={"ID":"53f4c1d9-1f56-473f-9820-14038f70b6c5","Type":"ContainerStarted","Data":"828a6b52b5b68e58b217af9eb3d093376978a521344fae243bac9cfce36ebc8b"}
Apr 17 11:19:03.319661 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:03.319615 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cnlnp" podStartSLOduration=32.971087134 podStartE2EDuration="34.319602074s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:19:01.471244436 +0000 UTC m=+165.241119573" lastFinishedPulling="2026-04-17 11:19:02.819759362 +0000 UTC m=+166.589634513" observedRunningTime="2026-04-17 11:19:03.319025971 +0000 UTC m=+167.088901130" watchObservedRunningTime="2026-04-17 11:19:03.319602074 +0000 UTC m=+167.089477231"
Apr 17 11:19:03.779262 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:03.779213 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4df6"
Apr 17 11:19:03.782706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:03.782686 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-whkkk\""
Apr 17 11:19:03.790170 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:03.790124 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4df6"
Apr 17 11:19:03.913785 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:03.913459 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b4df6"]
Apr 17 11:19:03.916320 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:03.916282 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e6ca43a_9f71_4557_be86_206743aee65b.slice/crio-d104781bb6af0049eb66458217d4d1eea962307bae257fe5aaee53cc286b001c WatchSource:0}: Error finding container d104781bb6af0049eb66458217d4d1eea962307bae257fe5aaee53cc286b001c: Status 404 returned error can't find the container with id d104781bb6af0049eb66458217d4d1eea962307bae257fe5aaee53cc286b001c
Apr 17 11:19:04.306338 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.306294 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b4df6" event={"ID":"0e6ca43a-9f71-4557-be86-206743aee65b","Type":"ContainerStarted","Data":"d104781bb6af0049eb66458217d4d1eea962307bae257fe5aaee53cc286b001c"}
Apr 17 11:19:04.495965 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.495925 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"]
Apr 17 11:19:04.499129 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.499106 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:04.502427 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.502400 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 11:19:04.502980 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.502958 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-bgk7f\""
Apr 17 11:19:04.514500 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.514475 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"]
Apr 17 11:19:04.591363 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.591279 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4c7fd490-e851-41be-8fd5-544739a025da-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6xz72\" (UID: \"4c7fd490-e851-41be-8fd5-544739a025da\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:04.692747 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:04.692710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4c7fd490-e851-41be-8fd5-544739a025da-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6xz72\" (UID: \"4c7fd490-e851-41be-8fd5-544739a025da\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:04.692930 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:19:04.692874 2580 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 11:19:04.692979 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:19:04.692946 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c7fd490-e851-41be-8fd5-544739a025da-tls-certificates podName:4c7fd490-e851-41be-8fd5-544739a025da nodeName:}" failed. No retries permitted until 2026-04-17 11:19:05.192925227 +0000 UTC m=+168.962800368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/4c7fd490-e851-41be-8fd5-544739a025da-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-6xz72" (UID: "4c7fd490-e851-41be-8fd5-544739a025da") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 11:19:05.197186 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:05.197118 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4c7fd490-e851-41be-8fd5-544739a025da-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6xz72\" (UID: \"4c7fd490-e851-41be-8fd5-544739a025da\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:05.200186 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:05.200125 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4c7fd490-e851-41be-8fd5-544739a025da-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6xz72\" (UID: \"4c7fd490-e851-41be-8fd5-544739a025da\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:05.410726 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:05.410701 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:05.534385 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:05.534336 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"]
Apr 17 11:19:05.538935 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:05.538894 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7fd490_e851_41be_8fd5_544739a025da.slice/crio-79f942e902f3c91330d7763664251665c8eecdb39e65fb1831efc18adfed9af9 WatchSource:0}: Error finding container 79f942e902f3c91330d7763664251665c8eecdb39e65fb1831efc18adfed9af9: Status 404 returned error can't find the container with id 79f942e902f3c91330d7763664251665c8eecdb39e65fb1831efc18adfed9af9
Apr 17 11:19:05.916958 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:05.916884 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d"
Apr 17 11:19:06.313773 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:06.313737 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72" event={"ID":"4c7fd490-e851-41be-8fd5-544739a025da","Type":"ContainerStarted","Data":"79f942e902f3c91330d7763664251665c8eecdb39e65fb1831efc18adfed9af9"}
Apr 17 11:19:06.315042 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:06.315017 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b4df6" event={"ID":"0e6ca43a-9f71-4557-be86-206743aee65b","Type":"ContainerStarted","Data":"4765d91f4456453223ebf9fefd9429f8eea30a73b3d4ca0fc44737cc84c79c18"}
Apr 17 11:19:06.331109 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:06.331070 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b4df6" podStartSLOduration=134.851301867 podStartE2EDuration="2m16.331057169s" podCreationTimestamp="2026-04-17 11:16:50 +0000 UTC" firstStartedPulling="2026-04-17 11:19:03.918612095 +0000 UTC m=+167.688487232" lastFinishedPulling="2026-04-17 11:19:05.398367396 +0000 UTC m=+169.168242534" observedRunningTime="2026-04-17 11:19:06.330722887 +0000 UTC m=+170.100598036" watchObservedRunningTime="2026-04-17 11:19:06.331057169 +0000 UTC m=+170.100932327"
Apr 17 11:19:08.321453 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:08.321410 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72" event={"ID":"4c7fd490-e851-41be-8fd5-544739a025da","Type":"ContainerStarted","Data":"ec309fd77748fae6a3a8dc095259d6c68c8e8413adff6330f12680adc850aced"}
Apr 17 11:19:08.321879 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:08.321631 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:08.326227 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:08.326198 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72"
Apr 17 11:19:08.336743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:08.336699 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xz72" podStartSLOduration=2.601045347 podStartE2EDuration="4.336686303s" podCreationTimestamp="2026-04-17 11:19:04 +0000 UTC" firstStartedPulling="2026-04-17 11:19:05.541019414 +0000 UTC m=+169.310894551" lastFinishedPulling="2026-04-17 11:19:07.276660372 +0000 UTC m=+171.046535507" observedRunningTime="2026-04-17 11:19:08.335827175 +0000 UTC m=+172.105702333" watchObservedRunningTime="2026-04-17 11:19:08.336686303 +0000 UTC m=+172.106561460"
Apr 17 11:19:08.779272 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:08.779230 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws"
Apr 17 11:19:11.300882 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.300846 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mwrb7"
Apr 17 11:19:11.925930 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.925895 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"]
Apr 17 11:19:11.929735 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.929711 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:11.932985 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.932762 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 11:19:11.932985 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.932852 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 11:19:11.932985 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.932902 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 11:19:11.932985 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.932945 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 11:19:11.934007 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.933969 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 11:19:11.934107 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.934028 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 11:19:11.934309 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.934292 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-l5ckv\""
Apr 17 11:19:11.942894 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.942872 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9qspg"]
Apr 17 11:19:11.945942 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.945925 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:11.958155 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.958110 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"]
Apr 17 11:19:11.959826 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.959775 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dblbs\""
Apr 17 11:19:11.960676 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.960655 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:19:11.960791 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.960658 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:19:11.961049 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:11.961034 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:19:12.054774 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054734 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.054940 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-tls\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.054940 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de979cf4-c06f-45f0-bb82-7b6441f7d69e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.054940 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-wtmp\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055107 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054948 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-accelerators-collector-config\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055107 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055107 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.054998 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbwx\" (UniqueName: \"kubernetes.io/projected/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-api-access-wsbwx\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.055107 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055031 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-sys\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055107 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055084 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.055278 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-textfile\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055278 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055250 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/de979cf4-c06f-45f0-bb82-7b6441f7d69e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.055342 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-root\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055342 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055313 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6m8w\" (UniqueName: \"kubernetes.io/projected/9c71ff3f-da65-4319-bd6a-2594634af03d-kube-api-access-x6m8w\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.055433 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055351 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.055433 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.055382 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c71ff3f-da65-4319-bd6a-2594634af03d-metrics-client-ca\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.156599 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156556 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.156599 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c71ff3f-da65-4319-bd6a-2594634af03d-metrics-client-ca\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.156872 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156621 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.156872 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-tls\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.156872 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de979cf4-c06f-45f0-bb82-7b6441f7d69e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.156872 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-wtmp\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.156872 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-accelerators-collector-config\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbwx\" (UniqueName: \"kubernetes.io/projected/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-api-access-wsbwx\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-sys\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.156976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157029 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-textfile\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157039 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-wtmp\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157070 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/de979cf4-c06f-45f0-bb82-7b6441f7d69e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.157119 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-root\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157162 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6m8w\" (UniqueName: \"kubernetes.io/projected/9c71ff3f-da65-4319-bd6a-2594634af03d-kube-api-access-x6m8w\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157282 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c71ff3f-da65-4319-bd6a-2594634af03d-metrics-client-ca\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157339 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-sys\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.157549 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157531 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-textfile\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157853 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157581 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9c71ff3f-da65-4319-bd6a-2594634af03d-root\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.157853 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157651 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/de979cf4-c06f-45f0-bb82-7b6441f7d69e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.157971 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.157924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-accelerators-collector-config\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.158307 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.158253 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de979cf4-c06f-45f0-bb82-7b6441f7d69e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.159797 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.159770 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-tls\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.159797 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.159784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.160040 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.160024 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9c71ff3f-da65-4319-bd6a-2594634af03d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.160310 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.160290 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.165982 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.165961 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6m8w\" (UniqueName: \"kubernetes.io/projected/9c71ff3f-da65-4319-bd6a-2594634af03d-kube-api-access-x6m8w\") pod \"node-exporter-9qspg\" (UID: \"9c71ff3f-da65-4319-bd6a-2594634af03d\") " pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.166309 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.166289 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbwx\" (UniqueName: \"kubernetes.io/projected/de979cf4-c06f-45f0-bb82-7b6441f7d69e-kube-api-access-wsbwx\") pod \"kube-state-metrics-69db897b98-kz9q2\" (UID: \"de979cf4-c06f-45f0-bb82-7b6441f7d69e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.242227 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.242195 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"
Apr 17 11:19:12.255120 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.255088 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9qspg"
Apr 17 11:19:12.264394 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:12.264357 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c71ff3f_da65_4319_bd6a_2594634af03d.slice/crio-aac377d993c11c33b00e3aa5b30ed3b87e506de92513dbd6b224cf4f63e853be WatchSource:0}: Error finding container aac377d993c11c33b00e3aa5b30ed3b87e506de92513dbd6b224cf4f63e853be: Status 404 returned error can't find the container with id aac377d993c11c33b00e3aa5b30ed3b87e506de92513dbd6b224cf4f63e853be
Apr 17 11:19:12.335743 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.335702 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9qspg" event={"ID":"9c71ff3f-da65-4319-bd6a-2594634af03d","Type":"ContainerStarted","Data":"aac377d993c11c33b00e3aa5b30ed3b87e506de92513dbd6b224cf4f63e853be"}
Apr 17 11:19:12.383843 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:12.383812 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kz9q2"]
Apr 17 11:19:12.385821 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:12.385799 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde979cf4_c06f_45f0_bb82_7b6441f7d69e.slice/crio-35f3737b560eccb66b7ce038fb304ca3703a19622c44e4ca12b81a009596b400 WatchSource:0}: Error finding container 35f3737b560eccb66b7ce038fb304ca3703a19622c44e4ca12b81a009596b400: Status 404 returned error can't find the container with id 35f3737b560eccb66b7ce038fb304ca3703a19622c44e4ca12b81a009596b400
Apr 17 11:19:13.339912 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:13.339829 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9qspg"
event={"ID":"9c71ff3f-da65-4319-bd6a-2594634af03d","Type":"ContainerStarted","Data":"70e535b6b456fa7f02f318856ce344db1b2e807c323c85ae6480bf76663b7a35"} Apr 17 11:19:13.340948 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:13.340908 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2" event={"ID":"de979cf4-c06f-45f0-bb82-7b6441f7d69e","Type":"ContainerStarted","Data":"35f3737b560eccb66b7ce038fb304ca3703a19622c44e4ca12b81a009596b400"} Apr 17 11:19:14.347294 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.347257 2580 generic.go:358] "Generic (PLEG): container finished" podID="9c71ff3f-da65-4319-bd6a-2594634af03d" containerID="70e535b6b456fa7f02f318856ce344db1b2e807c323c85ae6480bf76663b7a35" exitCode=0 Apr 17 11:19:14.347722 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.347324 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9qspg" event={"ID":"9c71ff3f-da65-4319-bd6a-2594634af03d","Type":"ContainerDied","Data":"70e535b6b456fa7f02f318856ce344db1b2e807c323c85ae6480bf76663b7a35"} Apr 17 11:19:14.349249 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.349219 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2" event={"ID":"de979cf4-c06f-45f0-bb82-7b6441f7d69e","Type":"ContainerStarted","Data":"d77da2530a2ead38fedc9cbebda062720fcef0ee36b37d5b1d336530b7e26856"} Apr 17 11:19:14.349342 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.349258 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2" event={"ID":"de979cf4-c06f-45f0-bb82-7b6441f7d69e","Type":"ContainerStarted","Data":"a40dd53cdc1b785283e1a592f82a6439089a4d4dd3c3f7ebcf546ceedd3bb796"} Apr 17 11:19:14.349342 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.349272 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2" event={"ID":"de979cf4-c06f-45f0-bb82-7b6441f7d69e","Type":"ContainerStarted","Data":"26be096c177c9cc7734e32b7bea4c2301beb25d1f99652a40937ce30f930f37c"} Apr 17 11:19:14.395261 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.395217 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-kz9q2" podStartSLOduration=2.147049701 podStartE2EDuration="3.39520008s" podCreationTimestamp="2026-04-17 11:19:11 +0000 UTC" firstStartedPulling="2026-04-17 11:19:12.387633906 +0000 UTC m=+176.157509042" lastFinishedPulling="2026-04-17 11:19:13.635784282 +0000 UTC m=+177.405659421" observedRunningTime="2026-04-17 11:19:14.393883872 +0000 UTC m=+178.163759031" watchObservedRunningTime="2026-04-17 11:19:14.39520008 +0000 UTC m=+178.165075237" Apr 17 11:19:14.987304 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.987268 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6465b67555-vz9g8"] Apr 17 11:19:14.991063 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.991044 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:14.994152 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.994111 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 11:19:14.994794 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.994769 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 11:19:14.994794 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.994781 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 11:19:14.994950 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.994852 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-f072j5fdqnvnv\"" Apr 17 11:19:14.995065 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.995038 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-2c8gg\"" Apr 17 11:19:14.995356 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.995331 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 11:19:14.995459 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:14.995383 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 11:19:15.003572 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.003551 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6465b67555-vz9g8"] Apr 17 11:19:15.083556 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083514 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083575 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-tls\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083595 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083618 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8p6t\" (UniqueName: \"kubernetes.io/projected/9044080e-7489-4b71-a984-595c13b74fbf-kube-api-access-j8p6t\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083686 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-grpc-tls\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083739 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.083938 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.083747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9044080e-7489-4b71-a984-595c13b74fbf-metrics-client-ca\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185007 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.184974 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185194 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-tls\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185281 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185281 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8p6t\" (UniqueName: \"kubernetes.io/projected/9044080e-7489-4b71-a984-595c13b74fbf-kube-api-access-j8p6t\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185382 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185298 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: 
\"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185382 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185350 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-grpc-tls\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185477 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.185477 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.185419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9044080e-7489-4b71-a984-595c13b74fbf-metrics-client-ca\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.186223 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.186197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9044080e-7489-4b71-a984-595c13b74fbf-metrics-client-ca\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.188045 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.188017 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.188527 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.188504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.188707 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.188678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-tls\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.188707 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.188687 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.188969 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.188950 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.189228 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.189213 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9044080e-7489-4b71-a984-595c13b74fbf-secret-grpc-tls\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.196383 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.196358 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8p6t\" (UniqueName: \"kubernetes.io/projected/9044080e-7489-4b71-a984-595c13b74fbf-kube-api-access-j8p6t\") pod \"thanos-querier-6465b67555-vz9g8\" (UID: \"9044080e-7489-4b71-a984-595c13b74fbf\") " pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.301681 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.301591 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:15.357224 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.356691 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9qspg" event={"ID":"9c71ff3f-da65-4319-bd6a-2594634af03d","Type":"ContainerStarted","Data":"60df2997e8a25d2476c9d11952642384363d92f82d7e85b5d3983c3960ce609a"} Apr 17 11:19:15.357224 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.356746 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9qspg" event={"ID":"9c71ff3f-da65-4319-bd6a-2594634af03d","Type":"ContainerStarted","Data":"4bcef931d56c4dba72e6d5cc1e7def1a701d79681b063c9a1ed593cbdfff6dac"} Apr 17 11:19:15.378937 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.378877 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9qspg" podStartSLOduration=3.401628975 podStartE2EDuration="4.378861326s" podCreationTimestamp="2026-04-17 11:19:11 +0000 UTC" firstStartedPulling="2026-04-17 11:19:12.266125338 +0000 UTC m=+176.036000475" lastFinishedPulling="2026-04-17 11:19:13.243357691 +0000 UTC m=+177.013232826" observedRunningTime="2026-04-17 11:19:15.377454683 +0000 UTC m=+179.147329842" watchObservedRunningTime="2026-04-17 11:19:15.378861326 +0000 UTC m=+179.148736484" Apr 17 11:19:15.453712 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:15.453676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6465b67555-vz9g8"] Apr 17 11:19:15.457733 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:15.457705 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9044080e_7489_4b71_a984_595c13b74fbf.slice/crio-965b4ec50456d4dff244b5096eaacc4c9e8b45392ac6df72accb8f01b79c8f2c WatchSource:0}: Error finding container 
965b4ec50456d4dff244b5096eaacc4c9e8b45392ac6df72accb8f01b79c8f2c: Status 404 returned error can't find the container with id 965b4ec50456d4dff244b5096eaacc4c9e8b45392ac6df72accb8f01b79c8f2c Apr 17 11:19:16.359940 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.359899 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"965b4ec50456d4dff244b5096eaacc4c9e8b45392ac6df72accb8f01b79c8f2c"} Apr 17 11:19:16.713047 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.713007 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm"] Apr 17 11:19:16.716537 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.716512 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:16.720055 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.720030 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 11:19:16.720180 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.720057 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-j88xk\"" Apr 17 11:19:16.727649 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.727626 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm"] Apr 17 11:19:16.800850 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.800823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/90ea982f-1e36-4df2-8ffa-fd560a3bbe91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qhcnm\" (UID: \"90ea982f-1e36-4df2-8ffa-fd560a3bbe91\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:16.901559 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:16.901520 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/90ea982f-1e36-4df2-8ffa-fd560a3bbe91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qhcnm\" (UID: \"90ea982f-1e36-4df2-8ffa-fd560a3bbe91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:16.901745 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:19:16.901710 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 11:19:16.901802 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:19:16.901794 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ea982f-1e36-4df2-8ffa-fd560a3bbe91-monitoring-plugin-cert podName:90ea982f-1e36-4df2-8ffa-fd560a3bbe91 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:17.401773199 +0000 UTC m=+181.171648340 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/90ea982f-1e36-4df2-8ffa-fd560a3bbe91-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-qhcnm" (UID: "90ea982f-1e36-4df2-8ffa-fd560a3bbe91") : secret "monitoring-plugin-cert" not found Apr 17 11:19:17.405537 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:17.405503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/90ea982f-1e36-4df2-8ffa-fd560a3bbe91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qhcnm\" (UID: \"90ea982f-1e36-4df2-8ffa-fd560a3bbe91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:17.408559 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:17.408530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/90ea982f-1e36-4df2-8ffa-fd560a3bbe91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qhcnm\" (UID: \"90ea982f-1e36-4df2-8ffa-fd560a3bbe91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:17.627607 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:17.627563 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:17.846566 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:17.846533 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm"] Apr 17 11:19:17.851387 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:19:17.851354 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90ea982f_1e36_4df2_8ffa_fd560a3bbe91.slice/crio-5e7120036fb4fdb7719cb8b37f928d9dfd072a3534cd160ad755ab515359193e WatchSource:0}: Error finding container 5e7120036fb4fdb7719cb8b37f928d9dfd072a3534cd160ad755ab515359193e: Status 404 returned error can't find the container with id 5e7120036fb4fdb7719cb8b37f928d9dfd072a3534cd160ad755ab515359193e Apr 17 11:19:18.368782 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:18.368701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"55932c10d9f9c736361b7d836f34c059319abb40dec7863e598ba8a3b26096d8"} Apr 17 11:19:18.368782 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:18.368753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"1453b2d8a958610610ca2bc2f05ef92a5179139af728d1ec9ca49d2adec73af3"} Apr 17 11:19:18.368782 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:18.368769 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"6502508ed2f1f0af687ad73440da733a5a006eb1e6f47ae8c87955eb367ba40c"} Apr 17 11:19:18.369847 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:18.369813 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" event={"ID":"90ea982f-1e36-4df2-8ffa-fd560a3bbe91","Type":"ContainerStarted","Data":"5e7120036fb4fdb7719cb8b37f928d9dfd072a3534cd160ad755ab515359193e"} Apr 17 11:19:19.289569 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.289545 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d4d9cc967-whvw8" Apr 17 11:19:19.375622 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.375534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"9a2443716abc1d5ef50eec323ab5735b070138d83dfd0695c2e249e08caaf4f8"} Apr 17 11:19:19.375622 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.375580 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"bb9b6a39818f32cfed18acd345541cd359bf487e07f50bdd5c7bbcd6aff8128a"} Apr 17 11:19:19.375622 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.375595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" event={"ID":"9044080e-7489-4b71-a984-595c13b74fbf","Type":"ContainerStarted","Data":"ef8567278340f6ade752f274af853562caa7950d53dccc1c799f6c2120364da3"} Apr 17 11:19:19.375855 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.375698 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:19.376892 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.376865 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" 
event={"ID":"90ea982f-1e36-4df2-8ffa-fd560a3bbe91","Type":"ContainerStarted","Data":"6d3bdf778de6d9ab7c88fbea55006c97fa2a46ca9c9a2c89390cdaefd149d097"} Apr 17 11:19:19.377095 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.377073 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:19.381753 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.381731 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" Apr 17 11:19:19.403558 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.403511 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" podStartSLOduration=1.7451027030000001 podStartE2EDuration="5.403497212s" podCreationTimestamp="2026-04-17 11:19:14 +0000 UTC" firstStartedPulling="2026-04-17 11:19:15.459487051 +0000 UTC m=+179.229362190" lastFinishedPulling="2026-04-17 11:19:19.11788156 +0000 UTC m=+182.887756699" observedRunningTime="2026-04-17 11:19:19.401935957 +0000 UTC m=+183.171811114" watchObservedRunningTime="2026-04-17 11:19:19.403497212 +0000 UTC m=+183.173372393" Apr 17 11:19:19.418514 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:19.418467 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qhcnm" podStartSLOduration=2.156006514 podStartE2EDuration="3.418454633s" podCreationTimestamp="2026-04-17 11:19:16 +0000 UTC" firstStartedPulling="2026-04-17 11:19:17.854088029 +0000 UTC m=+181.623963180" lastFinishedPulling="2026-04-17 11:19:19.116536159 +0000 UTC m=+182.886411299" observedRunningTime="2026-04-17 11:19:19.417877159 +0000 UTC m=+183.187752317" watchObservedRunningTime="2026-04-17 11:19:19.418454633 +0000 UTC m=+183.188329790" Apr 17 11:19:20.932725 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:20.932684 2580 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" podUID="73ee7579-8c67-47f1-84bd-e0c0f43b0e54" containerName="registry" containerID="cri-o://7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b" gracePeriod=30 Apr 17 11:19:21.171319 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.171292 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:19:21.247325 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247244 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-image-registry-private-configuration\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247325 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247281 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247325 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247311 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-certificates\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247618 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247332 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-bound-sa-token\") pod 
\"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247618 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247357 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-ca-trust-extracted\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247618 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247383 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-trusted-ca\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247618 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247420 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgmkv\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-kube-api-access-kgmkv\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247618 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247453 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-installation-pull-secrets\") pod \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\" (UID: \"73ee7579-8c67-47f1-84bd-e0c0f43b0e54\") " Apr 17 11:19:21.247855 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.247765 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:21.248482 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.248446 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:21.250067 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.250029 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:21.250450 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.250335 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:21.250560 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.250494 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-kube-api-access-kgmkv" (OuterVolumeSpecName: "kube-api-access-kgmkv") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "kube-api-access-kgmkv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:21.250560 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.250523 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:21.250639 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.250594 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:21.255921 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.255895 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "73ee7579-8c67-47f1-84bd-e0c0f43b0e54" (UID: "73ee7579-8c67-47f1-84bd-e0c0f43b0e54"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:19:21.348170 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348111 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-image-registry-private-configuration\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.348170 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348166 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-tls\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.348170 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348177 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-registry-certificates\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.348448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348188 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-bound-sa-token\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.348448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348198 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-ca-trust-extracted\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.348448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348207 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-trusted-ca\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 
11:19:21.348448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348216 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgmkv\" (UniqueName: \"kubernetes.io/projected/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-kube-api-access-kgmkv\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.348448 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.348233 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73ee7579-8c67-47f1-84bd-e0c0f43b0e54-installation-pull-secrets\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:19:21.384059 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.384027 2580 generic.go:358] "Generic (PLEG): container finished" podID="73ee7579-8c67-47f1-84bd-e0c0f43b0e54" containerID="7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b" exitCode=0 Apr 17 11:19:21.384250 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.384093 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" Apr 17 11:19:21.384250 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.384121 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" event={"ID":"73ee7579-8c67-47f1-84bd-e0c0f43b0e54","Type":"ContainerDied","Data":"7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b"} Apr 17 11:19:21.384250 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.384187 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8645cf9688-7dk9d" event={"ID":"73ee7579-8c67-47f1-84bd-e0c0f43b0e54","Type":"ContainerDied","Data":"f2ccd660e1f083e97b6b398e2747aae306941861a4bb4a38787dc5ccb7cb1d0e"} Apr 17 11:19:21.384250 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.384208 2580 scope.go:117] "RemoveContainer" containerID="7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b" Apr 17 11:19:21.396270 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.396247 2580 scope.go:117] "RemoveContainer" containerID="7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b" Apr 17 11:19:21.396566 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:19:21.396546 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b\": container with ID starting with 7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b not found: ID does not exist" containerID="7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b" Apr 17 11:19:21.396616 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.396576 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b"} err="failed to get container status 
\"7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b\": rpc error: code = NotFound desc = could not find container \"7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b\": container with ID starting with 7e1ab15fa379f499ec11dc170b46f4c1d9a070b39a07c60e44f5cd734266834b not found: ID does not exist" Apr 17 11:19:21.412389 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.412366 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8645cf9688-7dk9d"] Apr 17 11:19:21.415732 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:21.415707 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8645cf9688-7dk9d"] Apr 17 11:19:22.783558 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:22.783523 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ee7579-8c67-47f1-84bd-e0c0f43b0e54" path="/var/lib/kubelet/pods/73ee7579-8c67-47f1-84bd-e0c0f43b0e54/volumes" Apr 17 11:19:25.385798 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:25.385770 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6465b67555-vz9g8" Apr 17 11:19:48.460439 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:48.460408 2580 generic.go:358] "Generic (PLEG): container finished" podID="457a4466-3a04-4448-93fa-458f79dfc2e7" containerID="c5b4dbf1c8ab77ea46b6627e1b4b1454825e582bffefcb1219c327936055139f" exitCode=0 Apr 17 11:19:48.460859 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:48.460482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" event={"ID":"457a4466-3a04-4448-93fa-458f79dfc2e7","Type":"ContainerDied","Data":"c5b4dbf1c8ab77ea46b6627e1b4b1454825e582bffefcb1219c327936055139f"} Apr 17 11:19:48.460943 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:48.460926 2580 scope.go:117] 
"RemoveContainer" containerID="c5b4dbf1c8ab77ea46b6627e1b4b1454825e582bffefcb1219c327936055139f" Apr 17 11:19:49.464759 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:49.464718 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zd4c6" event={"ID":"457a4466-3a04-4448-93fa-458f79dfc2e7","Type":"ContainerStarted","Data":"154cadaff77887bc0e0d835861e0806878dd3b18b3ac2c749805e51cc61bf39b"} Apr 17 11:19:58.497759 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:58.497723 2580 generic.go:358] "Generic (PLEG): container finished" podID="4e15de31-bb9b-4066-b6d3-3121da3283ed" containerID="821edf6f47b345710157c15e61c78cb98206471e730119ad8038d740a0d5be04" exitCode=0 Apr 17 11:19:58.498157 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:58.497797 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4qnqt" event={"ID":"4e15de31-bb9b-4066-b6d3-3121da3283ed","Type":"ContainerDied","Data":"821edf6f47b345710157c15e61c78cb98206471e730119ad8038d740a0d5be04"} Apr 17 11:19:58.498207 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:58.498154 2580 scope.go:117] "RemoveContainer" containerID="821edf6f47b345710157c15e61c78cb98206471e730119ad8038d740a0d5be04" Apr 17 11:19:59.502496 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:19:59.502456 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4qnqt" event={"ID":"4e15de31-bb9b-4066-b6d3-3121da3283ed","Type":"ContainerStarted","Data":"a444df3cd049426bff64307f98050f72c80219bd3b360a0b04e8cd4c181f6fda"} Apr 17 11:20:28.500824 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:28.500767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: 
\"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:20:28.503365 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:28.503342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71c35dce-5b27-4704-95a2-e390345991dc-metrics-certs\") pod \"network-metrics-daemon-s9wws\" (UID: \"71c35dce-5b27-4704-95a2-e390345991dc\") " pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:20:28.583060 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:28.583021 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q4j6t\"" Apr 17 11:20:28.590397 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:28.590368 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9wws" Apr 17 11:20:28.713468 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:28.713438 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s9wws"] Apr 17 11:20:28.716590 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:20:28.716551 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c35dce_5b27_4704_95a2_e390345991dc.slice/crio-bf04f7a5903a94ea552bc02da81ed1e588ae53297637032d3584e0b4aa5c9723 WatchSource:0}: Error finding container bf04f7a5903a94ea552bc02da81ed1e588ae53297637032d3584e0b4aa5c9723: Status 404 returned error can't find the container with id bf04f7a5903a94ea552bc02da81ed1e588ae53297637032d3584e0b4aa5c9723 Apr 17 11:20:29.594050 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:29.593951 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s9wws" 
event={"ID":"71c35dce-5b27-4704-95a2-e390345991dc","Type":"ContainerStarted","Data":"bf04f7a5903a94ea552bc02da81ed1e588ae53297637032d3584e0b4aa5c9723"} Apr 17 11:20:30.598888 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:30.598850 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s9wws" event={"ID":"71c35dce-5b27-4704-95a2-e390345991dc","Type":"ContainerStarted","Data":"8c7c374b9d334e65ad7f14562737406da5eec4fb790a46ebceb3a0230b0af0d8"} Apr 17 11:20:30.598888 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:30.598886 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s9wws" event={"ID":"71c35dce-5b27-4704-95a2-e390345991dc","Type":"ContainerStarted","Data":"8cab2e4848eaea74760cc59e073d46415b0cad276810424453d5ea5dd061dd4d"} Apr 17 11:20:30.621338 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:20:30.621279 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-s9wws" podStartSLOduration=253.614510887 podStartE2EDuration="4m14.621262813s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:20:28.718627509 +0000 UTC m=+252.488502646" lastFinishedPulling="2026-04-17 11:20:29.725379434 +0000 UTC m=+253.495254572" observedRunningTime="2026-04-17 11:20:30.619451264 +0000 UTC m=+254.389326423" watchObservedRunningTime="2026-04-17 11:20:30.621262813 +0000 UTC m=+254.391137971" Apr 17 11:21:16.648603 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:16.648575 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log" Apr 17 11:21:16.648603 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:16.648605 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log" Apr 17 11:21:16.658698 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:16.658673 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:21:25.497996 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.497946 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6qj6k"] Apr 17 11:21:25.500367 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.498423 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73ee7579-8c67-47f1-84bd-e0c0f43b0e54" containerName="registry" Apr 17 11:21:25.500367 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.498439 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ee7579-8c67-47f1-84bd-e0c0f43b0e54" containerName="registry" Apr 17 11:21:25.500367 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.498496 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="73ee7579-8c67-47f1-84bd-e0c0f43b0e54" containerName="registry" Apr 17 11:21:25.501246 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.501226 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.504367 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.504341 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:21:25.508700 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.508676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6qj6k"] Apr 17 11:21:25.542254 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.542212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eadbbb24-1dd1-4f83-ae66-0948bf92370e-dbus\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.542444 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.542266 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eadbbb24-1dd1-4f83-ae66-0948bf92370e-original-pull-secret\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.542444 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.542295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eadbbb24-1dd1-4f83-ae66-0948bf92370e-kubelet-config\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.643046 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.643007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/eadbbb24-1dd1-4f83-ae66-0948bf92370e-dbus\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.643273 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.643064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eadbbb24-1dd1-4f83-ae66-0948bf92370e-original-pull-secret\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.643273 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.643095 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eadbbb24-1dd1-4f83-ae66-0948bf92370e-kubelet-config\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.643273 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.643231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eadbbb24-1dd1-4f83-ae66-0948bf92370e-kubelet-config\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.643273 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.643246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eadbbb24-1dd1-4f83-ae66-0948bf92370e-dbus\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.645695 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.645669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eadbbb24-1dd1-4f83-ae66-0948bf92370e-original-pull-secret\") pod \"global-pull-secret-syncer-6qj6k\" (UID: \"eadbbb24-1dd1-4f83-ae66-0948bf92370e\") " pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.811449 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.811353 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qj6k" Apr 17 11:21:25.940097 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.940062 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6qj6k"] Apr 17 11:21:25.943282 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:21:25.943249 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeadbbb24_1dd1_4f83_ae66_0948bf92370e.slice/crio-99b3dd7cb8e45ec1dc25c78a98276c078d59e0ec230562810fac0b84a23e7041 WatchSource:0}: Error finding container 99b3dd7cb8e45ec1dc25c78a98276c078d59e0ec230562810fac0b84a23e7041: Status 404 returned error can't find the container with id 99b3dd7cb8e45ec1dc25c78a98276c078d59e0ec230562810fac0b84a23e7041 Apr 17 11:21:25.945240 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:25.945216 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:21:26.759945 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:26.759900 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6qj6k" event={"ID":"eadbbb24-1dd1-4f83-ae66-0948bf92370e","Type":"ContainerStarted","Data":"99b3dd7cb8e45ec1dc25c78a98276c078d59e0ec230562810fac0b84a23e7041"} Apr 17 11:21:29.770628 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:29.770594 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6qj6k" 
event={"ID":"eadbbb24-1dd1-4f83-ae66-0948bf92370e","Type":"ContainerStarted","Data":"8ebf593a5030152d32a495a6236fa05c65c3d44f967d6cd44b19a2a4c07628dc"} Apr 17 11:21:29.786779 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:21:29.786724 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6qj6k" podStartSLOduration=1.087089444 podStartE2EDuration="4.786707527s" podCreationTimestamp="2026-04-17 11:21:25 +0000 UTC" firstStartedPulling="2026-04-17 11:21:25.94536711 +0000 UTC m=+309.715242246" lastFinishedPulling="2026-04-17 11:21:29.64498519 +0000 UTC m=+313.414860329" observedRunningTime="2026-04-17 11:21:29.78587938 +0000 UTC m=+313.555754540" watchObservedRunningTime="2026-04-17 11:21:29.786707527 +0000 UTC m=+313.556582686" Apr 17 11:22:25.541263 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.541220 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9"] Apr 17 11:22:25.544665 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.544646 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.547675 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.547654 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 11:22:25.548941 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.548921 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wjhnl\"" Apr 17 11:22:25.549055 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.548927 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 11:22:25.553303 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.553270 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9"] Apr 17 11:22:25.626124 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.626080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvtt\" (UniqueName: \"kubernetes.io/projected/634125ef-dc6a-4dd3-a5ff-166b274e8b54-kube-api-access-mgvtt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.626345 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.626162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.626345 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.626196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.726756 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.726713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvtt\" (UniqueName: \"kubernetes.io/projected/634125ef-dc6a-4dd3-a5ff-166b274e8b54-kube-api-access-mgvtt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.726867 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.726769 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.726867 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.726797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.727204 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.727188 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.727251 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.727238 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.735758 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.735730 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvtt\" (UniqueName: \"kubernetes.io/projected/634125ef-dc6a-4dd3-a5ff-166b274e8b54-kube-api-access-mgvtt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" Apr 17 11:22:25.855307 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.855217 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9"
Apr 17 11:22:25.987914 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:25.987884 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9"]
Apr 17 11:22:25.990623 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:22:25.990596 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod634125ef_dc6a_4dd3_a5ff_166b274e8b54.slice/crio-6437045dae1491e5c4e79d7e7252accb516efbdc6d20b376e5780231c35c8192 WatchSource:0}: Error finding container 6437045dae1491e5c4e79d7e7252accb516efbdc6d20b376e5780231c35c8192: Status 404 returned error can't find the container with id 6437045dae1491e5c4e79d7e7252accb516efbdc6d20b376e5780231c35c8192
Apr 17 11:22:26.952553 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:26.952501 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" event={"ID":"634125ef-dc6a-4dd3-a5ff-166b274e8b54","Type":"ContainerStarted","Data":"6437045dae1491e5c4e79d7e7252accb516efbdc6d20b376e5780231c35c8192"}
Apr 17 11:22:30.971427 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:30.971386 2580 generic.go:358] "Generic (PLEG): container finished" podID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerID="45d2f6bb93f9c9eeaeaf8730407ffc73be0aadd8d7042efd67cf75a781596dbe" exitCode=0
Apr 17 11:22:30.971895 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:30.971476 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" event={"ID":"634125ef-dc6a-4dd3-a5ff-166b274e8b54","Type":"ContainerDied","Data":"45d2f6bb93f9c9eeaeaf8730407ffc73be0aadd8d7042efd67cf75a781596dbe"}
Apr 17 11:22:32.980318 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:32.980286 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" event={"ID":"634125ef-dc6a-4dd3-a5ff-166b274e8b54","Type":"ContainerStarted","Data":"5514aad44e10a4075fc67329bc7a7be82f862409da7c0390f266576721c3ed44"}
Apr 17 11:22:33.984637 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:33.984600 2580 generic.go:358] "Generic (PLEG): container finished" podID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerID="5514aad44e10a4075fc67329bc7a7be82f862409da7c0390f266576721c3ed44" exitCode=0
Apr 17 11:22:33.985001 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:33.984641 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" event={"ID":"634125ef-dc6a-4dd3-a5ff-166b274e8b54","Type":"ContainerDied","Data":"5514aad44e10a4075fc67329bc7a7be82f862409da7c0390f266576721c3ed44"}
Apr 17 11:22:40.007303 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:40.007202 2580 generic.go:358] "Generic (PLEG): container finished" podID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerID="8e8603fc5c83cc18c403cb9998bc193f9bcbc27cac60c3048324ec28592bc9a2" exitCode=0
Apr 17 11:22:40.007303 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:40.007256 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" event={"ID":"634125ef-dc6a-4dd3-a5ff-166b274e8b54","Type":"ContainerDied","Data":"8e8603fc5c83cc18c403cb9998bc193f9bcbc27cac60c3048324ec28592bc9a2"}
Apr 17 11:22:41.130198 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.130173 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9"
Apr 17 11:22:41.262575 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.262478 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-bundle\") pod \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") "
Apr 17 11:22:41.262575 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.262571 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgvtt\" (UniqueName: \"kubernetes.io/projected/634125ef-dc6a-4dd3-a5ff-166b274e8b54-kube-api-access-mgvtt\") pod \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") "
Apr 17 11:22:41.262762 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.262591 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-util\") pod \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\" (UID: \"634125ef-dc6a-4dd3-a5ff-166b274e8b54\") "
Apr 17 11:22:41.263054 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.263026 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-bundle" (OuterVolumeSpecName: "bundle") pod "634125ef-dc6a-4dd3-a5ff-166b274e8b54" (UID: "634125ef-dc6a-4dd3-a5ff-166b274e8b54"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:22:41.265097 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.265060 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634125ef-dc6a-4dd3-a5ff-166b274e8b54-kube-api-access-mgvtt" (OuterVolumeSpecName: "kube-api-access-mgvtt") pod "634125ef-dc6a-4dd3-a5ff-166b274e8b54" (UID: "634125ef-dc6a-4dd3-a5ff-166b274e8b54"). InnerVolumeSpecName "kube-api-access-mgvtt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:22:41.266885 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.266865 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-util" (OuterVolumeSpecName: "util") pod "634125ef-dc6a-4dd3-a5ff-166b274e8b54" (UID: "634125ef-dc6a-4dd3-a5ff-166b274e8b54"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:22:41.364011 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.363974 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-bundle\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\""
Apr 17 11:22:41.364011 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.364006 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgvtt\" (UniqueName: \"kubernetes.io/projected/634125ef-dc6a-4dd3-a5ff-166b274e8b54-kube-api-access-mgvtt\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\""
Apr 17 11:22:41.364011 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:41.364017 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/634125ef-dc6a-4dd3-a5ff-166b274e8b54-util\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\""
Apr 17 11:22:42.014450 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:42.014425 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9"
Apr 17 11:22:42.014644 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:42.014422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqz6m9" event={"ID":"634125ef-dc6a-4dd3-a5ff-166b274e8b54","Type":"ContainerDied","Data":"6437045dae1491e5c4e79d7e7252accb516efbdc6d20b376e5780231c35c8192"}
Apr 17 11:22:42.014644 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:42.014531 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6437045dae1491e5c4e79d7e7252accb516efbdc6d20b376e5780231c35c8192"
Apr 17 11:22:47.759271 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759236 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"]
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759533 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="extract"
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759544 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="extract"
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759558 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="pull"
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759564 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="pull"
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759577 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="util"
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759582 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="util"
Apr 17 11:22:47.759706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.759630 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="634125ef-dc6a-4dd3-a5ff-166b274e8b54" containerName="extract"
Apr 17 11:22:47.793989 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.793951 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"]
Apr 17 11:22:47.794163 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.794106 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:47.797112 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.797087 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 11:22:47.797261 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.797087 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-kp8qv\""
Apr 17 11:22:47.797402 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.797389 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 11:22:47.797487 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.797467 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 11:22:47.916203 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.916129 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/40ffaca4-32af-449f-ab36-425115742ba5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk\" (UID: \"40ffaca4-32af-449f-ab36-425115742ba5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:47.916381 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:47.916295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b99rg\" (UniqueName: \"kubernetes.io/projected/40ffaca4-32af-449f-ab36-425115742ba5-kube-api-access-b99rg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk\" (UID: \"40ffaca4-32af-449f-ab36-425115742ba5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:48.017787 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:48.017688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b99rg\" (UniqueName: \"kubernetes.io/projected/40ffaca4-32af-449f-ab36-425115742ba5-kube-api-access-b99rg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk\" (UID: \"40ffaca4-32af-449f-ab36-425115742ba5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:48.017787 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:48.017743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/40ffaca4-32af-449f-ab36-425115742ba5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk\" (UID: \"40ffaca4-32af-449f-ab36-425115742ba5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:48.020379 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:48.020351 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/40ffaca4-32af-449f-ab36-425115742ba5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk\" (UID: \"40ffaca4-32af-449f-ab36-425115742ba5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:48.026709 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:48.026685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b99rg\" (UniqueName: \"kubernetes.io/projected/40ffaca4-32af-449f-ab36-425115742ba5-kube-api-access-b99rg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk\" (UID: \"40ffaca4-32af-449f-ab36-425115742ba5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:48.104768 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:48.104731 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:48.244548 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:48.244524 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"]
Apr 17 11:22:48.246809 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:22:48.246782 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40ffaca4_32af_449f_ab36_425115742ba5.slice/crio-32a4d5d594a50959bbdae1b0b2b9db33aa2cebf7f0b3b7bc97b8d512f19361df WatchSource:0}: Error finding container 32a4d5d594a50959bbdae1b0b2b9db33aa2cebf7f0b3b7bc97b8d512f19361df: Status 404 returned error can't find the container with id 32a4d5d594a50959bbdae1b0b2b9db33aa2cebf7f0b3b7bc97b8d512f19361df
Apr 17 11:22:49.038799 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:49.038751 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk" event={"ID":"40ffaca4-32af-449f-ab36-425115742ba5","Type":"ContainerStarted","Data":"32a4d5d594a50959bbdae1b0b2b9db33aa2cebf7f0b3b7bc97b8d512f19361df"}
Apr 17 11:22:52.062359 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.062320 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk" event={"ID":"40ffaca4-32af-449f-ab36-425115742ba5","Type":"ContainerStarted","Data":"2fbfe4bc428073b82bbf59e410ca9fa1022c8175529895af976460a8b62fad35"}
Apr 17 11:22:52.062775 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.062469 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:22:52.085108 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.085048 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk" podStartSLOduration=1.6702662240000001 podStartE2EDuration="5.085029557s" podCreationTimestamp="2026-04-17 11:22:47 +0000 UTC" firstStartedPulling="2026-04-17 11:22:48.248580182 +0000 UTC m=+392.018455317" lastFinishedPulling="2026-04-17 11:22:51.663343507 +0000 UTC m=+395.433218650" observedRunningTime="2026-04-17 11:22:52.08381745 +0000 UTC m=+395.853692608" watchObservedRunningTime="2026-04-17 11:22:52.085029557 +0000 UTC m=+395.854904712"
Apr 17 11:22:52.558280 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.558244 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"]
Apr 17 11:22:52.577024 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.576992 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"]
Apr 17 11:22:52.577281 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.577212 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.580467 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.580409 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 17 11:22:52.580467 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.580451 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-flcr6\""
Apr 17 11:22:52.580652 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.580633 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 17 11:22:52.660584 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.660536 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5p4\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-kube-api-access-cf5p4\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.660797 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.660595 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.660797 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.660719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b8699f01-3b1a-47f2-a377-96fb1ad11075-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.761559 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.761521 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5p4\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-kube-api-access-cf5p4\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.761559 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.761559 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.761790 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.761626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b8699f01-3b1a-47f2-a377-96fb1ad11075-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.761790 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:52.761713 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 17 11:22:52.761790 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:52.761737 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 11:22:52.761790 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:52.761757 2580 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 17 11:22:52.761790 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:52.761777 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 17 11:22:52.761970 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:52.761842 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates podName:b8699f01-3b1a-47f2-a377-96fb1ad11075 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:53.261820613 +0000 UTC m=+397.031695765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates") pod "keda-metrics-apiserver-7c9f485588-xm2f8" (UID: "b8699f01-3b1a-47f2-a377-96fb1ad11075") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 17 11:22:52.761970 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.761950 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b8699f01-3b1a-47f2-a377-96fb1ad11075-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.773959 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.773929 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5p4\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-kube-api-access-cf5p4\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:52.850721 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.850624 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-vkwfr"]
Apr 17 11:22:52.870524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.870492 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vkwfr"]
Apr 17 11:22:52.870706 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.870621 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:52.873239 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.873212 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 17 11:22:52.963392 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.963355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7ceca172-9d5a-42d0-b658-4cdea5b6d43f-certificates\") pod \"keda-admission-cf49989db-vkwfr\" (UID: \"7ceca172-9d5a-42d0-b658-4cdea5b6d43f\") " pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:52.963574 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:52.963451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572kd\" (UniqueName: \"kubernetes.io/projected/7ceca172-9d5a-42d0-b658-4cdea5b6d43f-kube-api-access-572kd\") pod \"keda-admission-cf49989db-vkwfr\" (UID: \"7ceca172-9d5a-42d0-b658-4cdea5b6d43f\") " pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:53.063849 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.063817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-572kd\" (UniqueName: \"kubernetes.io/projected/7ceca172-9d5a-42d0-b658-4cdea5b6d43f-kube-api-access-572kd\") pod \"keda-admission-cf49989db-vkwfr\" (UID: \"7ceca172-9d5a-42d0-b658-4cdea5b6d43f\") " pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:53.064294 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.063887 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7ceca172-9d5a-42d0-b658-4cdea5b6d43f-certificates\") pod \"keda-admission-cf49989db-vkwfr\" (UID: \"7ceca172-9d5a-42d0-b658-4cdea5b6d43f\") " pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:53.066793 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.066766 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7ceca172-9d5a-42d0-b658-4cdea5b6d43f-certificates\") pod \"keda-admission-cf49989db-vkwfr\" (UID: \"7ceca172-9d5a-42d0-b658-4cdea5b6d43f\") " pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:53.073577 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.073548 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-572kd\" (UniqueName: \"kubernetes.io/projected/7ceca172-9d5a-42d0-b658-4cdea5b6d43f-kube-api-access-572kd\") pod \"keda-admission-cf49989db-vkwfr\" (UID: \"7ceca172-9d5a-42d0-b658-4cdea5b6d43f\") " pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:53.185028 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.185001 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:53.266449 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.266416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:53.266718 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:53.266565 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 17 11:22:53.266718 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:53.266586 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 11:22:53.266718 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:53.266608 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8: references non-existent secret key: tls.crt
Apr 17 11:22:53.266718 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:53.266683 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates podName:b8699f01-3b1a-47f2-a377-96fb1ad11075 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:54.266662797 +0000 UTC m=+398.036537946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates") pod "keda-metrics-apiserver-7c9f485588-xm2f8" (UID: "b8699f01-3b1a-47f2-a377-96fb1ad11075") : references non-existent secret key: tls.crt
Apr 17 11:22:53.314435 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:53.314405 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vkwfr"]
Apr 17 11:22:53.317451 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:22:53.317425 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ceca172_9d5a_42d0_b658_4cdea5b6d43f.slice/crio-faa93d5965d083b1f08a6652d175dc0731e7ecc3f56f5316a7e0967cadd9fa45 WatchSource:0}: Error finding container faa93d5965d083b1f08a6652d175dc0731e7ecc3f56f5316a7e0967cadd9fa45: Status 404 returned error can't find the container with id faa93d5965d083b1f08a6652d175dc0731e7ecc3f56f5316a7e0967cadd9fa45
Apr 17 11:22:54.071260 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:54.071156 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vkwfr" event={"ID":"7ceca172-9d5a-42d0-b658-4cdea5b6d43f","Type":"ContainerStarted","Data":"faa93d5965d083b1f08a6652d175dc0731e7ecc3f56f5316a7e0967cadd9fa45"}
Apr 17 11:22:54.275402 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:54.275360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:54.275600 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:54.275555 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 17 11:22:54.275600 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:54.275581 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 11:22:54.275701 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:54.275605 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8: references non-existent secret key: tls.crt
Apr 17 11:22:54.275701 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:54.275673 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates podName:b8699f01-3b1a-47f2-a377-96fb1ad11075 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:56.275653061 +0000 UTC m=+400.045528230 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates") pod "keda-metrics-apiserver-7c9f485588-xm2f8" (UID: "b8699f01-3b1a-47f2-a377-96fb1ad11075") : references non-existent secret key: tls.crt
Apr 17 11:22:55.076312 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:55.076269 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vkwfr" event={"ID":"7ceca172-9d5a-42d0-b658-4cdea5b6d43f","Type":"ContainerStarted","Data":"97579adfef099aa03754448c4643a864789c4dd2668f175993df7bbdbe403a95"}
Apr 17 11:22:55.076723 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:55.076420 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:22:55.092084 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:55.092024 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-vkwfr" podStartSLOduration=1.52528072 podStartE2EDuration="3.09200703s" podCreationTimestamp="2026-04-17 11:22:52 +0000 UTC" firstStartedPulling="2026-04-17 11:22:53.318769815 +0000 UTC m=+397.088644951" lastFinishedPulling="2026-04-17 11:22:54.885496111 +0000 UTC m=+398.655371261" observedRunningTime="2026-04-17 11:22:55.091747759 +0000 UTC m=+398.861622917" watchObservedRunningTime="2026-04-17 11:22:55.09200703 +0000 UTC m=+398.861882190"
Apr 17 11:22:56.293393 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:22:56.293348 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:22:56.293893 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:56.293465 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 17 11:22:56.293893 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:56.293481 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 11:22:56.293893 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:56.293504 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8: references non-existent secret key: tls.crt
Apr 17 11:22:56.293893 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:22:56.293556 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates podName:b8699f01-3b1a-47f2-a377-96fb1ad11075 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:00.293540747 +0000 UTC m=+404.063415883 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates") pod "keda-metrics-apiserver-7c9f485588-xm2f8" (UID: "b8699f01-3b1a-47f2-a377-96fb1ad11075") : references non-existent secret key: tls.crt
Apr 17 11:23:00.329124 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:00.329069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:23:00.331858 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:00.331834 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b8699f01-3b1a-47f2-a377-96fb1ad11075-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xm2f8\" (UID: \"b8699f01-3b1a-47f2-a377-96fb1ad11075\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:23:00.390649 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:00.390611 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:23:00.525092 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:00.520857 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"]
Apr 17 11:23:01.097717 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:01.097678 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8" event={"ID":"b8699f01-3b1a-47f2-a377-96fb1ad11075","Type":"ContainerStarted","Data":"f5f5b04952e29e1710bce6632f0c36777d5cd94b675434db6d244bd0525abfac"}
Apr 17 11:23:03.106582 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:03.106540 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8" event={"ID":"b8699f01-3b1a-47f2-a377-96fb1ad11075","Type":"ContainerStarted","Data":"e48fd330f3a3dd79d7c7a60a267d00ae22c9c215fb4cef670513b01bb6aa70cc"}
Apr 17 11:23:03.107082 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:03.106656 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:23:03.124091 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:03.124039 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8" podStartSLOduration=8.97692136 podStartE2EDuration="11.124021855s" podCreationTimestamp="2026-04-17 11:22:52 +0000 UTC" firstStartedPulling="2026-04-17 11:23:00.525983748 +0000 UTC m=+404.295858898" lastFinishedPulling="2026-04-17 11:23:02.673084254 +0000 UTC m=+406.442959393" observedRunningTime="2026-04-17 11:23:03.123319405 +0000 UTC m=+406.893194565" watchObservedRunningTime="2026-04-17 11:23:03.124021855 +0000 UTC m=+406.893897013"
Apr 17 11:23:13.069425 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:13.069391 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ztzxk"
Apr 17 11:23:14.115783 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:14.115753 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xm2f8"
Apr 17 11:23:16.081435 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:23:16.081406 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-vkwfr"
Apr 17 11:24:01.410596 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.410557 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-8svzz"]
Apr 17 11:24:01.414430 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.414412 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz"
Apr 17 11:24:01.417648 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.417624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 17 11:24:01.417793 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.417689 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-cdpcn\""
Apr 17 11:24:01.417793 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.417708 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 11:24:01.418558 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.418536 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 11:24:01.425982 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.425958 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-8svzz"]
Apr 17 11:24:01.443704 ip-10-0-135-188
kubenswrapper[2580]: I0417 11:24:01.443674 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-s6x2f"] Apr 17 11:24:01.446920 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.446898 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.449603 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.449580 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-69gzf\"" Apr 17 11:24:01.449603 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.449592 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 11:24:01.455702 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.455661 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-s6x2f"] Apr 17 11:24:01.463474 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.463426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98a8ec98-b14e-4f7d-8ac3-701b47dccfef-data\") pod \"seaweedfs-86cc847c5c-s6x2f\" (UID: \"98a8ec98-b14e-4f7d-8ac3-701b47dccfef\") " pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.463588 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.463491 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lqqp\" (UniqueName: \"kubernetes.io/projected/98a8ec98-b14e-4f7d-8ac3-701b47dccfef-kube-api-access-2lqqp\") pod \"seaweedfs-86cc847c5c-s6x2f\" (UID: \"98a8ec98-b14e-4f7d-8ac3-701b47dccfef\") " pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.463588 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.463566 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b390940c-e1a7-4445-8fc3-14108d866e32-cert\") pod \"kserve-controller-manager-7dcb9f9f85-8svzz\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.463703 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.463609 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwgds\" (UniqueName: \"kubernetes.io/projected/b390940c-e1a7-4445-8fc3-14108d866e32-kube-api-access-qwgds\") pod \"kserve-controller-manager-7dcb9f9f85-8svzz\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.564621 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.564575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b390940c-e1a7-4445-8fc3-14108d866e32-cert\") pod \"kserve-controller-manager-7dcb9f9f85-8svzz\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.564621 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.564630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwgds\" (UniqueName: \"kubernetes.io/projected/b390940c-e1a7-4445-8fc3-14108d866e32-kube-api-access-qwgds\") pod \"kserve-controller-manager-7dcb9f9f85-8svzz\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.564885 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.564655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98a8ec98-b14e-4f7d-8ac3-701b47dccfef-data\") pod \"seaweedfs-86cc847c5c-s6x2f\" (UID: \"98a8ec98-b14e-4f7d-8ac3-701b47dccfef\") " pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.564885 ip-10-0-135-188 kubenswrapper[2580]: I0417 
11:24:01.564682 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lqqp\" (UniqueName: \"kubernetes.io/projected/98a8ec98-b14e-4f7d-8ac3-701b47dccfef-kube-api-access-2lqqp\") pod \"seaweedfs-86cc847c5c-s6x2f\" (UID: \"98a8ec98-b14e-4f7d-8ac3-701b47dccfef\") " pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.565090 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.565055 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98a8ec98-b14e-4f7d-8ac3-701b47dccfef-data\") pod \"seaweedfs-86cc847c5c-s6x2f\" (UID: \"98a8ec98-b14e-4f7d-8ac3-701b47dccfef\") " pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.567295 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.567269 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b390940c-e1a7-4445-8fc3-14108d866e32-cert\") pod \"kserve-controller-manager-7dcb9f9f85-8svzz\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.577055 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.574491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lqqp\" (UniqueName: \"kubernetes.io/projected/98a8ec98-b14e-4f7d-8ac3-701b47dccfef-kube-api-access-2lqqp\") pod \"seaweedfs-86cc847c5c-s6x2f\" (UID: \"98a8ec98-b14e-4f7d-8ac3-701b47dccfef\") " pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.580529 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.580502 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwgds\" (UniqueName: \"kubernetes.io/projected/b390940c-e1a7-4445-8fc3-14108d866e32-kube-api-access-qwgds\") pod \"kserve-controller-manager-7dcb9f9f85-8svzz\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.728245 
ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.728199 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:01.757564 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.757514 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:01.879819 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.879783 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-8svzz"] Apr 17 11:24:01.881005 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:24:01.880969 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb390940c_e1a7_4445_8fc3_14108d866e32.slice/crio-a71b55394843a962c5411278c1199617bcccd885c3a96ad3fb7f5327c27fa882 WatchSource:0}: Error finding container a71b55394843a962c5411278c1199617bcccd885c3a96ad3fb7f5327c27fa882: Status 404 returned error can't find the container with id a71b55394843a962c5411278c1199617bcccd885c3a96ad3fb7f5327c27fa882 Apr 17 11:24:01.907443 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:01.907421 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-s6x2f"] Apr 17 11:24:01.910066 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:24:01.910038 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98a8ec98_b14e_4f7d_8ac3_701b47dccfef.slice/crio-783292093b81b8a46eff99d9ad7285fb6f3a4c02c54f4c75d4cc8bcec09ef13a WatchSource:0}: Error finding container 783292093b81b8a46eff99d9ad7285fb6f3a4c02c54f4c75d4cc8bcec09ef13a: Status 404 returned error can't find the container with id 783292093b81b8a46eff99d9ad7285fb6f3a4c02c54f4c75d4cc8bcec09ef13a Apr 17 11:24:02.314385 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:02.314331 2580 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve/seaweedfs-86cc847c5c-s6x2f" event={"ID":"98a8ec98-b14e-4f7d-8ac3-701b47dccfef","Type":"ContainerStarted","Data":"783292093b81b8a46eff99d9ad7285fb6f3a4c02c54f4c75d4cc8bcec09ef13a"} Apr 17 11:24:02.315634 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:02.315595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" event={"ID":"b390940c-e1a7-4445-8fc3-14108d866e32","Type":"ContainerStarted","Data":"a71b55394843a962c5411278c1199617bcccd885c3a96ad3fb7f5327c27fa882"} Apr 17 11:24:06.333823 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:06.333783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" event={"ID":"b390940c-e1a7-4445-8fc3-14108d866e32","Type":"ContainerStarted","Data":"05ea5b3b88101c4cceff3c16e5092518da7b59eba7efce24894fb3c3eb63ab8a"} Apr 17 11:24:06.334267 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:06.334066 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:06.335204 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:06.335176 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-s6x2f" event={"ID":"98a8ec98-b14e-4f7d-8ac3-701b47dccfef","Type":"ContainerStarted","Data":"c921fc5033613d4c0781406438587fc0188282e95e9bb8b2835af1ca4efacdb1"} Apr 17 11:24:06.335338 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:06.335264 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:06.351045 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:06.351000 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" podStartSLOduration=1.553884085 podStartE2EDuration="5.350986903s" podCreationTimestamp="2026-04-17 11:24:01 +0000 UTC" 
firstStartedPulling="2026-04-17 11:24:01.882345397 +0000 UTC m=+465.652220532" lastFinishedPulling="2026-04-17 11:24:05.679448214 +0000 UTC m=+469.449323350" observedRunningTime="2026-04-17 11:24:06.349568075 +0000 UTC m=+470.119443236" watchObservedRunningTime="2026-04-17 11:24:06.350986903 +0000 UTC m=+470.120862061" Apr 17 11:24:06.364746 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:06.364706 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-s6x2f" podStartSLOduration=1.539238907 podStartE2EDuration="5.364693657s" podCreationTimestamp="2026-04-17 11:24:01 +0000 UTC" firstStartedPulling="2026-04-17 11:24:01.911346651 +0000 UTC m=+465.681221787" lastFinishedPulling="2026-04-17 11:24:05.736801401 +0000 UTC m=+469.506676537" observedRunningTime="2026-04-17 11:24:06.364119367 +0000 UTC m=+470.133994525" watchObservedRunningTime="2026-04-17 11:24:06.364693657 +0000 UTC m=+470.134568818" Apr 17 11:24:12.341825 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:12.341796 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-s6x2f" Apr 17 11:24:37.294429 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.294395 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-8svzz"] Apr 17 11:24:37.294889 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.294621 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" podUID="b390940c-e1a7-4445-8fc3-14108d866e32" containerName="manager" containerID="cri-o://05ea5b3b88101c4cceff3c16e5092518da7b59eba7efce24894fb3c3eb63ab8a" gracePeriod=10 Apr 17 11:24:37.299858 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.299833 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:37.323770 ip-10-0-135-188 
kubenswrapper[2580]: I0417 11:24:37.323744 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-b2xvt"] Apr 17 11:24:37.326530 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.326515 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.336757 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.336734 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-b2xvt"] Apr 17 11:24:37.434655 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.434604 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10196e76-d24e-4b01-8ef5-bf65cddc8e3c-cert\") pod \"kserve-controller-manager-7dcb9f9f85-b2xvt\" (UID: \"10196e76-d24e-4b01-8ef5-bf65cddc8e3c\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.434844 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.434667 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw24w\" (UniqueName: \"kubernetes.io/projected/10196e76-d24e-4b01-8ef5-bf65cddc8e3c-kube-api-access-xw24w\") pod \"kserve-controller-manager-7dcb9f9f85-b2xvt\" (UID: \"10196e76-d24e-4b01-8ef5-bf65cddc8e3c\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.443024 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.442994 2580 generic.go:358] "Generic (PLEG): container finished" podID="b390940c-e1a7-4445-8fc3-14108d866e32" containerID="05ea5b3b88101c4cceff3c16e5092518da7b59eba7efce24894fb3c3eb63ab8a" exitCode=0 Apr 17 11:24:37.443180 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.443043 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" 
event={"ID":"b390940c-e1a7-4445-8fc3-14108d866e32","Type":"ContainerDied","Data":"05ea5b3b88101c4cceff3c16e5092518da7b59eba7efce24894fb3c3eb63ab8a"} Apr 17 11:24:37.525826 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.525805 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:37.535753 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.535728 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw24w\" (UniqueName: \"kubernetes.io/projected/10196e76-d24e-4b01-8ef5-bf65cddc8e3c-kube-api-access-xw24w\") pod \"kserve-controller-manager-7dcb9f9f85-b2xvt\" (UID: \"10196e76-d24e-4b01-8ef5-bf65cddc8e3c\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.535841 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.535796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10196e76-d24e-4b01-8ef5-bf65cddc8e3c-cert\") pod \"kserve-controller-manager-7dcb9f9f85-b2xvt\" (UID: \"10196e76-d24e-4b01-8ef5-bf65cddc8e3c\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.538266 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.538250 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10196e76-d24e-4b01-8ef5-bf65cddc8e3c-cert\") pod \"kserve-controller-manager-7dcb9f9f85-b2xvt\" (UID: \"10196e76-d24e-4b01-8ef5-bf65cddc8e3c\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.547452 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.547368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw24w\" (UniqueName: \"kubernetes.io/projected/10196e76-d24e-4b01-8ef5-bf65cddc8e3c-kube-api-access-xw24w\") pod \"kserve-controller-manager-7dcb9f9f85-b2xvt\" (UID: 
\"10196e76-d24e-4b01-8ef5-bf65cddc8e3c\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.636615 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.636578 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b390940c-e1a7-4445-8fc3-14108d866e32-cert\") pod \"b390940c-e1a7-4445-8fc3-14108d866e32\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " Apr 17 11:24:37.636783 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.636639 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwgds\" (UniqueName: \"kubernetes.io/projected/b390940c-e1a7-4445-8fc3-14108d866e32-kube-api-access-qwgds\") pod \"b390940c-e1a7-4445-8fc3-14108d866e32\" (UID: \"b390940c-e1a7-4445-8fc3-14108d866e32\") " Apr 17 11:24:37.639032 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.639001 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b390940c-e1a7-4445-8fc3-14108d866e32-kube-api-access-qwgds" (OuterVolumeSpecName: "kube-api-access-qwgds") pod "b390940c-e1a7-4445-8fc3-14108d866e32" (UID: "b390940c-e1a7-4445-8fc3-14108d866e32"). InnerVolumeSpecName "kube-api-access-qwgds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:24:37.639032 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.639007 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b390940c-e1a7-4445-8fc3-14108d866e32-cert" (OuterVolumeSpecName: "cert") pod "b390940c-e1a7-4445-8fc3-14108d866e32" (UID: "b390940c-e1a7-4445-8fc3-14108d866e32"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:24:37.689349 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.689312 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:37.738183 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.737836 2580 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b390940c-e1a7-4445-8fc3-14108d866e32-cert\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:24:37.738183 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.737889 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwgds\" (UniqueName: \"kubernetes.io/projected/b390940c-e1a7-4445-8fc3-14108d866e32-kube-api-access-qwgds\") on node \"ip-10-0-135-188.ec2.internal\" DevicePath \"\"" Apr 17 11:24:37.827556 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:37.827531 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-b2xvt"] Apr 17 11:24:37.830206 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:24:37.830174 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10196e76_d24e_4b01_8ef5_bf65cddc8e3c.slice/crio-f2d62fee8e44df89364081e1dd8e6357c19b4e7a40ecebc72684a4c30e2b2c91 WatchSource:0}: Error finding container f2d62fee8e44df89364081e1dd8e6357c19b4e7a40ecebc72684a4c30e2b2c91: Status 404 returned error can't find the container with id f2d62fee8e44df89364081e1dd8e6357c19b4e7a40ecebc72684a4c30e2b2c91 Apr 17 11:24:38.447859 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.447778 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" Apr 17 11:24:38.447859 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.447798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-8svzz" event={"ID":"b390940c-e1a7-4445-8fc3-14108d866e32","Type":"ContainerDied","Data":"a71b55394843a962c5411278c1199617bcccd885c3a96ad3fb7f5327c27fa882"} Apr 17 11:24:38.447859 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.447847 2580 scope.go:117] "RemoveContainer" containerID="05ea5b3b88101c4cceff3c16e5092518da7b59eba7efce24894fb3c3eb63ab8a" Apr 17 11:24:38.449321 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.449295 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" event={"ID":"10196e76-d24e-4b01-8ef5-bf65cddc8e3c","Type":"ContainerStarted","Data":"e22a7bdc3d050b793ab809fddeac444d30c44b7e0c526192e0795f3b02985cd3"} Apr 17 11:24:38.449425 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.449328 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" event={"ID":"10196e76-d24e-4b01-8ef5-bf65cddc8e3c","Type":"ContainerStarted","Data":"f2d62fee8e44df89364081e1dd8e6357c19b4e7a40ecebc72684a4c30e2b2c91"} Apr 17 11:24:38.449425 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.449412 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:24:38.472961 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.472908 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" podStartSLOduration=1.115812169 podStartE2EDuration="1.472892297s" podCreationTimestamp="2026-04-17 11:24:37 +0000 UTC" firstStartedPulling="2026-04-17 11:24:37.831480941 +0000 UTC m=+501.601356077" lastFinishedPulling="2026-04-17 11:24:38.188561065 +0000 UTC 
m=+501.958436205" observedRunningTime="2026-04-17 11:24:38.471501136 +0000 UTC m=+502.241376293" watchObservedRunningTime="2026-04-17 11:24:38.472892297 +0000 UTC m=+502.242767456" Apr 17 11:24:38.487181 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.487146 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-8svzz"] Apr 17 11:24:38.493225 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.493201 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-8svzz"] Apr 17 11:24:38.784524 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:24:38.784447 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b390940c-e1a7-4445-8fc3-14108d866e32" path="/var/lib/kubelet/pods/b390940c-e1a7-4445-8fc3-14108d866e32/volumes" Apr 17 11:25:09.459871 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:09.459790 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-b2xvt" Apr 17 11:25:10.296151 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.296098 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-spbck"] Apr 17 11:25:10.296482 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.296469 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b390940c-e1a7-4445-8fc3-14108d866e32" containerName="manager" Apr 17 11:25:10.296532 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.296485 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b390940c-e1a7-4445-8fc3-14108d866e32" containerName="manager" Apr 17 11:25:10.296577 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.296567 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b390940c-e1a7-4445-8fc3-14108d866e32" containerName="manager" Apr 17 11:25:10.299645 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.299622 2580 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-spbck" Apr 17 11:25:10.302443 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.302419 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 11:25:10.302616 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.302593 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-rvrz5\"" Apr 17 11:25:10.311088 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.311062 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-spbck"] Apr 17 11:25:10.318410 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.318383 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19232515-f74d-45a3-b93f-4f513984904f-tls-certs\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck" Apr 17 11:25:10.318562 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.318536 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrwl4\" (UniqueName: \"kubernetes.io/projected/19232515-f74d-45a3-b93f-4f513984904f-kube-api-access-nrwl4\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck" Apr 17 11:25:10.419566 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.419525 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrwl4\" (UniqueName: \"kubernetes.io/projected/19232515-f74d-45a3-b93f-4f513984904f-kube-api-access-nrwl4\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck" Apr 17 
11:25:10.419753 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.419587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19232515-f74d-45a3-b93f-4f513984904f-tls-certs\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:25:10.419753 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:25:10.419717 2580 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 17 11:25:10.419838 ip-10-0-135-188 kubenswrapper[2580]: E0417 11:25:10.419778 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19232515-f74d-45a3-b93f-4f513984904f-tls-certs podName:19232515-f74d-45a3-b93f-4f513984904f nodeName:}" failed. No retries permitted until 2026-04-17 11:25:10.919760832 +0000 UTC m=+534.689635968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/19232515-f74d-45a3-b93f-4f513984904f-tls-certs") pod "model-serving-api-86f7b4b499-spbck" (UID: "19232515-f74d-45a3-b93f-4f513984904f") : secret "model-serving-api-tls" not found
Apr 17 11:25:10.432051 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.432019 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrwl4\" (UniqueName: \"kubernetes.io/projected/19232515-f74d-45a3-b93f-4f513984904f-kube-api-access-nrwl4\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:25:10.922988 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.922954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19232515-f74d-45a3-b93f-4f513984904f-tls-certs\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:25:10.925571 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:10.925542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19232515-f74d-45a3-b93f-4f513984904f-tls-certs\") pod \"model-serving-api-86f7b4b499-spbck\" (UID: \"19232515-f74d-45a3-b93f-4f513984904f\") " pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:25:11.212381 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:11.212336 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:25:11.339263 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:11.339228 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-spbck"]
Apr 17 11:25:11.343196 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:25:11.343158 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19232515_f74d_45a3_b93f_4f513984904f.slice/crio-998e24d06d1e0e8ce43833acccc72e3365af8840b24e4bb32d37679fc7750ceb WatchSource:0}: Error finding container 998e24d06d1e0e8ce43833acccc72e3365af8840b24e4bb32d37679fc7750ceb: Status 404 returned error can't find the container with id 998e24d06d1e0e8ce43833acccc72e3365af8840b24e4bb32d37679fc7750ceb
Apr 17 11:25:11.570439 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:11.570352 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-spbck" event={"ID":"19232515-f74d-45a3-b93f-4f513984904f","Type":"ContainerStarted","Data":"998e24d06d1e0e8ce43833acccc72e3365af8840b24e4bb32d37679fc7750ceb"}
Apr 17 11:25:13.579067 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:13.579031 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-spbck" event={"ID":"19232515-f74d-45a3-b93f-4f513984904f","Type":"ContainerStarted","Data":"038496e9fbff5bb7228091716c7b4a5d5be00e5032c495f104df645d2054290d"}
Apr 17 11:25:13.579472 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:13.579094 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:25:13.597645 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:13.597595 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-spbck" podStartSLOduration=1.8689416300000001 podStartE2EDuration="3.597581538s" podCreationTimestamp="2026-04-17 11:25:10 +0000 UTC" firstStartedPulling="2026-04-17 11:25:11.345369717 +0000 UTC m=+535.115244852" lastFinishedPulling="2026-04-17 11:25:13.074009624 +0000 UTC m=+536.843884760" observedRunningTime="2026-04-17 11:25:13.595365876 +0000 UTC m=+537.365241035" watchObservedRunningTime="2026-04-17 11:25:13.597581538 +0000 UTC m=+537.367456694"
Apr 17 11:25:24.588806 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:25:24.588775 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-spbck"
Apr 17 11:26:09.419903 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:09.419868 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6qj6k_eadbbb24-1dd1-4f83-ae66-0948bf92370e/global-pull-secret-syncer/0.log"
Apr 17 11:26:09.642819 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:09.642786 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vjnsr_048dc60e-f359-4a3d-b877-c94afe6b9af6/konnectivity-agent/0.log"
Apr 17 11:26:09.691852 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:09.691816 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-188.ec2.internal_0fb21237899613074192be5ee06d9825/haproxy/0.log"
Apr 17 11:26:13.222769 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.222736 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kz9q2_de979cf4-c06f-45f0-bb82-7b6441f7d69e/kube-state-metrics/0.log"
Apr 17 11:26:13.252693 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.252661 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kz9q2_de979cf4-c06f-45f0-bb82-7b6441f7d69e/kube-rbac-proxy-main/0.log"
Apr 17 11:26:13.283476 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.283449 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kz9q2_de979cf4-c06f-45f0-bb82-7b6441f7d69e/kube-rbac-proxy-self/0.log"
Apr 17 11:26:13.354321 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.354294 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-qhcnm_90ea982f-1e36-4df2-8ffa-fd560a3bbe91/monitoring-plugin/0.log"
Apr 17 11:26:13.384088 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.384057 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9qspg_9c71ff3f-da65-4319-bd6a-2594634af03d/node-exporter/0.log"
Apr 17 11:26:13.407168 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.407124 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9qspg_9c71ff3f-da65-4319-bd6a-2594634af03d/kube-rbac-proxy/0.log"
Apr 17 11:26:13.432399 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.432377 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9qspg_9c71ff3f-da65-4319-bd6a-2594634af03d/init-textfile/0.log"
Apr 17 11:26:13.948059 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:13.948026 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6xz72_4c7fd490-e851-41be-8fd5-544739a025da/prometheus-operator-admission-webhook/0.log"
Apr 17 11:26:14.053013 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:14.052982 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6465b67555-vz9g8_9044080e-7489-4b71-a984-595c13b74fbf/thanos-query/0.log"
Apr 17 11:26:14.074340 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:14.074310 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6465b67555-vz9g8_9044080e-7489-4b71-a984-595c13b74fbf/kube-rbac-proxy-web/0.log"
Apr 17 11:26:14.102589 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:14.102563 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6465b67555-vz9g8_9044080e-7489-4b71-a984-595c13b74fbf/kube-rbac-proxy/0.log"
Apr 17 11:26:14.131533 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:14.131505 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6465b67555-vz9g8_9044080e-7489-4b71-a984-595c13b74fbf/prom-label-proxy/0.log"
Apr 17 11:26:14.162093 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:14.162068 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6465b67555-vz9g8_9044080e-7489-4b71-a984-595c13b74fbf/kube-rbac-proxy-rules/0.log"
Apr 17 11:26:14.193442 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:14.193412 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6465b67555-vz9g8_9044080e-7489-4b71-a984-595c13b74fbf/kube-rbac-proxy-metrics/0.log"
Apr 17 11:26:15.266926 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:15.266894 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-cnlnp_53f4c1d9-1f56-473f-9820-14038f70b6c5/networking-console-plugin/0.log"
Apr 17 11:26:15.718688 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:15.718658 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log"
Apr 17 11:26:15.723054 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:15.723032 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/2.log"
Apr 17 11:26:16.675655 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.675627 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log"
Apr 17 11:26:16.677251 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.677227 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ch5s_29717902-39f6-4c43-9cb6-a981d0f5b344/console-operator/1.log"
Apr 17 11:26:16.732632 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.732604 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"]
Apr 17 11:26:16.736091 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.736073 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.738866 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.738837 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkw4c\"/\"openshift-service-ca.crt\""
Apr 17 11:26:16.738866 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.738849 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkw4c\"/\"kube-root-ca.crt\""
Apr 17 11:26:16.739942 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.739924 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xkw4c\"/\"default-dockercfg-vtcwd\""
Apr 17 11:26:16.743463 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.743380 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"]
Apr 17 11:26:16.791242 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.791211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-sys\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.791394 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.791316 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-proc\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.791394 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.791345 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-lib-modules\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.791394 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.791367 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g725\" (UniqueName: \"kubernetes.io/projected/cfca8689-7c64-48be-8e72-9abe6f4e833a-kube-api-access-5g725\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.791502 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.791397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-podres\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892417 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892383 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-lib-modules\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892417 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g725\" (UniqueName: \"kubernetes.io/projected/cfca8689-7c64-48be-8e72-9abe6f4e833a-kube-api-access-5g725\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892658 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892445 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-podres\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892658 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892548 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-podres\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892658 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892559 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-lib-modules\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892658 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892613 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-sys\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892835 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892719 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-proc\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892835 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892741 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-sys\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.892835 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.892810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cfca8689-7c64-48be-8e72-9abe6f4e833a-proc\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:16.901496 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:16.901475 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g725\" (UniqueName: \"kubernetes.io/projected/cfca8689-7c64-48be-8e72-9abe6f4e833a-kube-api-access-5g725\") pod \"perf-node-gather-daemonset-dqv5t\" (UID: \"cfca8689-7c64-48be-8e72-9abe6f4e833a\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:17.047597 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.047559 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:17.186344 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.186319 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"]
Apr 17 11:26:17.189241 ip-10-0-135-188 kubenswrapper[2580]: W0417 11:26:17.189208 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcfca8689_7c64_48be_8e72_9abe6f4e833a.slice/crio-99097641f2d2dc8533ef25a0e78e92b12fb00d10aa132d2e4e07cccbcea9b226 WatchSource:0}: Error finding container 99097641f2d2dc8533ef25a0e78e92b12fb00d10aa132d2e4e07cccbcea9b226: Status 404 returned error can't find the container with id 99097641f2d2dc8533ef25a0e78e92b12fb00d10aa132d2e4e07cccbcea9b226
Apr 17 11:26:17.358761 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.358682 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mwrb7_7991569c-ec27-417a-8b37-b1129ca90932/dns/0.log"
Apr 17 11:26:17.379468 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.379440 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mwrb7_7991569c-ec27-417a-8b37-b1129ca90932/kube-rbac-proxy/0.log"
Apr 17 11:26:17.471157 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.471101 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kgg54_6830c0fb-a8f5-4cb8-a8a1-5f307a010dcb/dns-node-resolver/0.log"
Apr 17 11:26:17.801082 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.801044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t" event={"ID":"cfca8689-7c64-48be-8e72-9abe6f4e833a","Type":"ContainerStarted","Data":"452b365c7f071b1fa89e46f4efa099dcc0fa4bcc2c6f48172bb9eb977cc31d72"}
Apr 17 11:26:17.801499 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.801089 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t" event={"ID":"cfca8689-7c64-48be-8e72-9abe6f4e833a","Type":"ContainerStarted","Data":"99097641f2d2dc8533ef25a0e78e92b12fb00d10aa132d2e4e07cccbcea9b226"}
Apr 17 11:26:17.801499 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.801169 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:17.820436 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.820387 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t" podStartSLOduration=1.82037062 podStartE2EDuration="1.82037062s" podCreationTimestamp="2026-04-17 11:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:26:17.818755756 +0000 UTC m=+601.588630915" watchObservedRunningTime="2026-04-17 11:26:17.82037062 +0000 UTC m=+601.590245846"
Apr 17 11:26:17.898482 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.898452 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-d4d9cc967-whvw8_d2f73caf-8c19-40b8-a6b8-f066ed884db8/registry/0.log"
Apr 17 11:26:17.926837 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:17.926807 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bfkcr_2604f0e7-0ee7-4d02-adf7-f046ecf35e36/node-ca/0.log"
Apr 17 11:26:19.057804 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:19.057770 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b4df6_0e6ca43a-9f71-4557-be86-206743aee65b/serve-healthcheck-canary/0.log"
Apr 17 11:26:19.463157 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:19.463096 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4qnqt_4e15de31-bb9b-4066-b6d3-3121da3283ed/insights-operator/1.log"
Apr 17 11:26:19.463357 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:19.463343 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4qnqt_4e15de31-bb9b-4066-b6d3-3121da3283ed/insights-operator/0.log"
Apr 17 11:26:19.623912 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:19.623874 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-srvwb_7c2bbce7-232a-4fe0-bb21-1fc7feafdfff/kube-rbac-proxy/0.log"
Apr 17 11:26:19.647668 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:19.647638 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-srvwb_7c2bbce7-232a-4fe0-bb21-1fc7feafdfff/exporter/0.log"
Apr 17 11:26:19.669443 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:19.669416 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-srvwb_7c2bbce7-232a-4fe0-bb21-1fc7feafdfff/extractor/0.log"
Apr 17 11:26:21.619479 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:21.619454 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7dcb9f9f85-b2xvt_10196e76-d24e-4b01-8ef5-bf65cddc8e3c/manager/0.log"
Apr 17 11:26:21.666399 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:21.666368 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-spbck_19232515-f74d-45a3-b93f-4f513984904f/server/0.log"
Apr 17 11:26:21.745587 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:21.745560 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-s6x2f_98a8ec98-b14e-4f7d-8ac3-701b47dccfef/seaweedfs/0.log"
Apr 17 11:26:23.814157 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:23.814117 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-dqv5t"
Apr 17 11:26:25.970124 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:25.970083 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zd4c6_457a4466-3a04-4448-93fa-458f79dfc2e7/kube-storage-version-migrator-operator/1.log"
Apr 17 11:26:25.971244 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:25.971218 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zd4c6_457a4466-3a04-4448-93fa-458f79dfc2e7/kube-storage-version-migrator-operator/0.log"
Apr 17 11:26:27.356755 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.356724 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/kube-multus-additional-cni-plugins/0.log"
Apr 17 11:26:27.383605 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.383572 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/egress-router-binary-copy/0.log"
Apr 17 11:26:27.406395 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.406366 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/cni-plugins/0.log"
Apr 17 11:26:27.430297 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.430269 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/bond-cni-plugin/0.log"
Apr 17 11:26:27.450085 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.450062 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/routeoverride-cni/0.log"
Apr 17 11:26:27.474223 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.474193 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/whereabouts-cni-bincopy/0.log"
Apr 17 11:26:27.495439 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.495414 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fdfx7_f53bbbe9-bf96-40ee-8b77-64c3d3d8d96d/whereabouts-cni/0.log"
Apr 17 11:26:27.555804 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.555769 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kxvmt_a280fe8d-c697-436b-8324-1581f46fa362/kube-multus/0.log"
Apr 17 11:26:27.703874 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.703846 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-s9wws_71c35dce-5b27-4704-95a2-e390345991dc/network-metrics-daemon/0.log"
Apr 17 11:26:27.725475 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:27.725431 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-s9wws_71c35dce-5b27-4704-95a2-e390345991dc/kube-rbac-proxy/0.log"
Apr 17 11:26:28.490945 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.490916 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/ovn-controller/0.log"
Apr 17 11:26:28.510998 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.510968 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/ovn-acl-logging/0.log"
Apr 17 11:26:28.532857 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.532830 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/kube-rbac-proxy-node/0.log"
Apr 17 11:26:28.556601 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.556576 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 11:26:28.576554 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.576534 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/northd/0.log"
Apr 17 11:26:28.597009 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.596976 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/nbdb/0.log"
Apr 17 11:26:28.617783 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.617760 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/sbdb/0.log"
Apr 17 11:26:28.713515 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:28.713482 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt59h_dada1323-9bb8-41bf-87e3-fddbcc3aa159/ovnkube-controller/0.log"
Apr 17 11:26:30.257883 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:30.257785 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-2tfhp_10a16b0d-5319-46a0-8c7e-2b4e12d48031/check-endpoints/0.log"
Apr 17 11:26:30.286545 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:30.286522 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b97qz_37df9c48-6708-4b3b-9cca-a6c82f4f253f/network-check-target-container/0.log"
Apr 17 11:26:31.300288 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:31.300262 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wjx5n_b0e81b66-d7e8-4dcf-baec-e09afe76648c/iptables-alerter/0.log"
Apr 17 11:26:31.943580 ip-10-0-135-188 kubenswrapper[2580]: I0417 11:26:31.943542 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cd864_e4eb1882-8f3f-45aa-bdf1-10c1296b7af5/tuned/0.log"