Apr 23 08:12:24.728451 ip-10-0-142-255 systemd[1]: Starting Kubernetes Kubelet... Apr 23 08:12:25.254855 ip-10-0-142-255 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 08:12:25.254855 ip-10-0-142-255 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 23 08:12:25.254855 ip-10-0-142-255 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 08:12:25.254855 ip-10-0-142-255 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 23 08:12:25.254855 ip-10-0-142-255 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 08:12:25.255553 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.254906 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 23 08:12:25.259403 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259387 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:12:25.259403 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259403 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259407 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259411 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259414 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259417 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259420 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259423 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259426 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259429 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259431 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259434 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 
08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259436 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259439 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259442 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259444 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259447 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259450 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259457 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259462 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:12:25.259469 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259466 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259468 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259471 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259474 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259477 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259479 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259482 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259485 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259487 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259490 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259493 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259495 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259498 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259500 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259502 2579 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259505 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259508 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259510 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259512 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259516 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:12:25.259921 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259518 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259521 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259523 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259525 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259529 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259533 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259536 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259539 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259542 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259544 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259548 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259550 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259553 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259555 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259558 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259561 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259563 2579 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259566 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259569 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:12:25.260436 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259571 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259574 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259577 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259579 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259582 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259586 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259588 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259594 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259597 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259599 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259602 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259604 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259607 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259609 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259611 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259614 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259616 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259619 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259621 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259624 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 
08:12:25.260927 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259626 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259629 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259632 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259634 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259637 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259640 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259642 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.259999 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260004 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260006 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260010 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260012 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260015 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260018 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260021 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260023 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260026 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260029 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260031 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260034 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:12:25.261429 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260037 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260040 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260042 
2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260045 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260047 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260050 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260069 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260072 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260075 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260079 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260083 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260086 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260089 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260091 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260094 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260097 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260099 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260102 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260105 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:12:25.261906 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260107 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260110 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260113 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260115 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260118 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:12:25.262397 
ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260120 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260123 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260126 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260129 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260138 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260142 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260144 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260147 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260150 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260153 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260155 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260158 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260161 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260163 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260166 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:12:25.262397 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260168 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260171 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260173 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260176 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260178 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260181 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260183 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260186 2579 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260189 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260191 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260194 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260197 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260200 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260202 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260205 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260208 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260210 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260213 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260215 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260218 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:12:25.262922 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260221 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260223 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260226 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260228 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260231 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260233 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260236 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260238 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260241 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260243 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:12:25.263434 
ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260247 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260249 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260252 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.260254 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261785 2579 flags.go:64] FLAG: --address="0.0.0.0" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261800 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261806 2579 flags.go:64] FLAG: --anonymous-auth="true" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261811 2579 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261815 2579 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261819 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261824 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 23 08:12:25.263434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261828 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261831 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261835 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261838 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261842 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261845 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261848 2579 flags.go:64] FLAG: --cgroup-root="" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261851 2579 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261854 2579 flags.go:64] FLAG: --client-ca-file="" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261856 2579 flags.go:64] FLAG: --cloud-config="" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261859 2579 flags.go:64] FLAG: --cloud-provider="external" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261862 2579 flags.go:64] FLAG: --cluster-dns="[]" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261867 2579 flags.go:64] FLAG: --cluster-domain="" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261870 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 23 08:12:25.263941 ip-10-0-142-255 
kubenswrapper[2579]: I0423 08:12:25.261873 2579 flags.go:64] FLAG: --config-dir="" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261876 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261879 2579 flags.go:64] FLAG: --container-log-max-files="5" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261883 2579 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261886 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261890 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261893 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261896 2579 flags.go:64] FLAG: --contention-profiling="false" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261899 2579 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261902 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261905 2579 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 23 08:12:25.263941 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261908 2579 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261912 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261916 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261919 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261922 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261925 2579 flags.go:64] FLAG: --enable-server="true" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261928 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261932 2579 flags.go:64] FLAG: --event-burst="100" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261936 2579 flags.go:64] FLAG: --event-qps="50" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261939 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261943 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261945 2579 flags.go:64] FLAG: --eviction-hard="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261949 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261952 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261955 2579 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261958 2579 flags.go:64] FLAG: --eviction-soft="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261961 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261964 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261967 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261970 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261973 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261975 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261978 2579 flags.go:64] FLAG: --feature-gates="" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261982 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261985 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 08:12:25.264568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261988 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261992 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261995 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.261999 2579 flags.go:64] FLAG: --help="false" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262002 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262005 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262008 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262011 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262014 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262018 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262021 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262024 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262027 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262030 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262033 2579 flags.go:64] 
FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262036 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262039 2579 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262042 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262045 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262048 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262051 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262066 2579 flags.go:64] FLAG: --lock-file="" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262069 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262072 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:12:25.265196 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262075 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262081 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262083 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262086 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262089 2579 flags.go:64] FLAG: --logging-format="text" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262092 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262095 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262098 2579 flags.go:64] FLAG: --manifest-url="" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262101 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262105 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262109 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262113 2579 flags.go:64] FLAG: --max-pods="110" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262116 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262119 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262122 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262125 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 
08:12:25.262128 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262131 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262134 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262141 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262145 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262147 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262150 2579 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:12:25.265817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262153 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262160 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262163 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262166 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262169 2579 flags.go:64] FLAG: --port="10250" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262172 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262175 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08c42892f5255e59f" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262179 2579 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262182 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262185 2579 flags.go:64] FLAG: --register-node="true" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262188 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262191 2579 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262194 2579 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262197 2579 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262200 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262203 2579 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262207 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262210 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262213 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 
08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262216 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262219 2579 flags.go:64] FLAG: --runonce="false" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262222 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262225 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262228 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262231 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262234 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:12:25.266385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262237 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262240 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262244 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262246 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262250 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262253 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262255 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262258 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262261 2579 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262264 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262269 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262272 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262274 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262279 2579 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262282 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262284 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262287 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262290 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262293 2579 flags.go:64] FLAG: 
--v="2" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262297 2579 flags.go:64] FLAG: --version="false" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262301 2579 flags.go:64] FLAG: --vmodule="" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262305 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.262308 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262401 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:12:25.267023 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262406 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262413 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262416 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262420 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262422 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262425 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262427 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262430 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262433 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262435 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262438 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262440 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262444 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262446 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262449 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262452 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262454 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:12:25.267603 ip-10-0-142-255 
kubenswrapper[2579]: W0423 08:12:25.262457 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262460 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262462 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:12:25.267603 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262465 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262468 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262470 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262473 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262475 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262478 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262480 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262483 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262485 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262488 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262490 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262493 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262495 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262500 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262503 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262505 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262509 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262513 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262516 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:25.268147 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262519 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262521 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262523 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262526 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262528 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262531 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262533 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262536 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262538 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262541 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262544 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262546 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262549 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262551 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262554 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262556 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262558 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262561 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262564 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262566 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:25.268630 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262568 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262571 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262574 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262577 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262579 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262582 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262585 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262588 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262591 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262593 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262596 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262599 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262602 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262604 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262607 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262609 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262612 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262614 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262616 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262619 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:25.269202 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262622 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:25.269696 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262624 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:25.269696 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262627 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:25.269696 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262629 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:25.269696 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262632 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:25.269696 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.262634 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:25.269696 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.263471 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:25.271318 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.271300 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:12:25.271360 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.271320 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271367 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271372 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271375 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271379 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271382 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271385 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271388 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:25.271388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271391 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271394 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271397 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271400 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271402 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271405 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271408 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271410 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271413 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271416 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271418 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271420 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271423 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271425 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271428 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271430 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271433 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271435 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271438 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271440 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:25.271592 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271443 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271445 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271448 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271451 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271454 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271457 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271459 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271461 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271464 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271467 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271469 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271471 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271475 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271479 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271482 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271485 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271488 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271490 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271493 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:25.272171 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271495 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271498 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271500 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271505 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271509 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271512 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271515 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271518 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271521 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271524 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271526 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271529 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271531 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271534 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271537 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271539 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271542 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271544 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271548 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:25.272635 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271552 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271554 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271557 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271559 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271562 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271564 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271567 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271570 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271573 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271576 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271578 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271581 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271584 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271586 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271589 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271591 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271593 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271596 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271599 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271602 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:25.273132 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271605 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.271610 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271710 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271715 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271718 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271721 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271724 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271727 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271730 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271733 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271736 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271739 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271743 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271745 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271748 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:25.273627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271751 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271753 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271756 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271758 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271761 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271763 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271766 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271769 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271771 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271774 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271776 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271779 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271782 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271784 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271787 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271790 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271793 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271795 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271798 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271801 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:25.274036 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271803 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271806 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271808 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271811 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271813 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271816 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271819 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271821 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271824 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271826 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271829 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271832 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271835 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271837 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271841 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271845 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271848 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271851 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271854 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:25.274561 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271857 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271860 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271862 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271864 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271868 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271871 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271874 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271876 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271879 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271881 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271884 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271886 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271888 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271891 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271894 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271896 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271899 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271901 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271904 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271906 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:25.275034 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271908 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271911 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271913 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271916 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271919 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271922 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271924 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271927 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271929 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271932 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271934 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271937 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271939 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:25.271942 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.271947 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:25.275552 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.272044 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:12:25.275927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.274355 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:12:25.275927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.275464 2579 server.go:1019] "Starting client certificate rotation"
Apr 23 08:12:25.275927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.275558 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:12:25.275927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.275597 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:12:25.306252 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.306223 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:12:25.310672 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.310653 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:12:25.324851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.324830 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:12:25.332012 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.331993 2579 log.go:25] "Validated CRI v1 image API"
Apr 23 08:12:25.333429 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.333398 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:12:25.335930 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.335910 2579 fs.go:135] Filesystem UUIDs: map[56ee17c1-abc2-4844-aaa6-87dc6469522e:/dev/nvme0n1p3 6fde971d-f0f4-4df1-9abc-8a5a313fb7a8:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 23 08:12:25.336006 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.335928 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:12:25.340090 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.340071 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:12:25.341693 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.341587 2579 manager.go:217] Machine: {Timestamp:2026-04-23 08:12:25.339512949 +0000 UTC m=+0.466516766 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092981 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21ad725ff1c9976a9063983b1b9347 SystemUUID:ec21ad72-5ff1-c997-6a90-63983b1b9347 BootID:0d7182de-058f-462a-9003-69e62fa8212c Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cd:8b:1e:79:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cd:8b:1e:79:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:72:a7:c0:cd:bf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:12:25.341693 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.341689 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:12:25.341832 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.341817 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:12:25.344740 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.344716 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:12:25.344869 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.344742 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-255.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:12:25.344925 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.344878 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:12:25.344925 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.344889 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:12:25.344925 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.344905 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:12:25.346576 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.346565 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:12:25.347930 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.347921 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:12:25.348035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.348025 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:12:25.350747 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.350736 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:12:25.350785 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.350754 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:12:25.350785 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.350770 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:12:25.350785 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.350779 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:12:25.350875 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.350792 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:12:25.351927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.351916 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:12:25.351976 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.351934 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:12:25.357571 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.357553 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:12:25.359372 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.359357 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 08:12:25.361559 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361545 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361566 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361574 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361582 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361591 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361598 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361604 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361609 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361616 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361622 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 08:12:25.361628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361631 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 08:12:25.361879 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.361641 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 08:12:25.362788 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.362778 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 08:12:25.362788 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.362788 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 08:12:25.366764 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.366751 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 08:12:25.366824 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.366786 2579 server.go:1295] "Started kubelet"
Apr 23 08:12:25.366906 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.366881 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 08:12:25.366955 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.366896 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 08:12:25.367001 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.366962 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 08:12:25.367328 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.367307 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-255.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 08:12:25.367386 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.367367 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 08:12:25.367421 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.367396 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 08:12:25.367601 ip-10-0-142-255 systemd[1]: Started Kubernetes Kubelet.
Apr 23 08:12:25.368511 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.368493 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 08:12:25.369013 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.368998 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 08:12:25.375042 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.373952 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-255.ec2.internal.18a8ee3091a13935 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-255.ec2.internal,UID:ip-10-0-142-255.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-255.ec2.internal,},FirstTimestamp:2026-04-23 08:12:25.366763829 +0000 UTC m=+0.493767646,LastTimestamp:2026-04-23 08:12:25.366763829 +0000 UTC m=+0.493767646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-255.ec2.internal,}"
Apr 23 08:12:25.376241 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.376211 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 08:12:25.376904 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.376884 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 08:12:25.377472 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377456 2579 factory.go:55] Registering systemd factory
Apr 23 08:12:25.377556 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377477 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 23 08:12:25.377723 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377709 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 08:12:25.377774 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377710 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 08:12:25.377774 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377735 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 08:12:25.377851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377835 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 08:12:25.377851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.377844 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 08:12:25.377981 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.377966 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found"
Apr 23 08:12:25.378078 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.378050 2579 factory.go:153] Registering CRI-O factory
Apr 23 08:12:25.378142 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.378083 2579 factory.go:223] Registration of the crio container factory successfully
Apr 23 08:12:25.378142 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.378133 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 08:12:25.378239 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.378166 2579 factory.go:103] Registering Raw factory
Apr 23 08:12:25.378239 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.378182 2579 manager.go:1196] Started watching for new ooms in manager
Apr 23 08:12:25.378547 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.378533 2579 manager.go:319] Starting recovery of all containers
Apr 23 08:12:25.379325 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.379280 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 08:12:25.382175 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.382149 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 08:12:25.382270 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.382215 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 08:12:25.386444 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.386420 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f2vmq"
Apr 23 08:12:25.388246 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.388227 2579 manager.go:324] Recovery completed
Apr 23 08:12:25.390625 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.390606 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 23 08:12:25.393518 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.393505 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:25.394979 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.394964 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f2vmq"
Apr 23 08:12:25.396278 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.396264 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:25.396332 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.396293 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:25.396332 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.396306 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:25.396817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.396805 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 08:12:25.396817 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.396816 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 08:12:25.396907 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.396832 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:12:25.398154 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.398087 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-255.ec2.internal.18a8ee30936393c1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-255.ec2.internal,UID:ip-10-0-142-255.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-255.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-255.ec2.internal,},FirstTimestamp:2026-04-23 08:12:25.396278209 +0000 UTC m=+0.523282026,LastTimestamp:2026-04-23 08:12:25.396278209 +0000 UTC m=+0.523282026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-255.ec2.internal,}"
Apr 23 08:12:25.400044 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.400032 2579 policy_none.go:49] "None policy: Start"
Apr 23 08:12:25.400111 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.400048 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 08:12:25.400111 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.400075 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 08:12:25.445680 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.445572 2579 manager.go:341] "Starting Device Plugin manager"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.445719 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.445735 2579 server.go:85] "Starting device plugin registration server"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.445909 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.445918 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.446089 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.446157 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.446165 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.446628 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 08:12:25.456994 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.446662 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-255.ec2.internal\" not found"
Apr 23 08:12:25.529333 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.529271 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 08:12:25.530597 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.530582 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 08:12:25.530690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.530608 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 08:12:25.530690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.530628 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 08:12:25.530690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.530638 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:12:25.530860 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.530704 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:12:25.536527 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.536506 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:25.547012 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.546997 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:25.547736 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.547723 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:25.547810 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.547752 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:25.547810 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.547766 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:25.547810 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.547792 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-255.ec2.internal"
Apr 23 08:12:25.555848 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.555829 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-255.ec2.internal"
Apr 23 08:12:25.555848 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.555849 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-255.ec2.internal\": node \"ip-10-0-142-255.ec2.internal\" not found"
Apr 23 08:12:25.572190 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.572167 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found"
Apr 23 08:12:25.631775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.631751 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal"]
Apr 23 08:12:25.631849 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.631830 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:25.632613 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.632591 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:25.632697 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.632619 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:25.632697 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.632635 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:25.634793 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.634781 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:25.634932 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.634919 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal"
Apr 23 08:12:25.634978 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.634945 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:25.635448 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.635431 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:25.635535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.635431 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:25.635535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.635482 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:25.635535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.635496 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:25.635535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.635459 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:25.635535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.635535 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:25.637595 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.637580 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.637652 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.637606 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:12:25.638651 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.638636 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:12:25.638732 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.638659 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:12:25.638732 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.638673 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:12:25.665632 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.665615 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-255.ec2.internal\" not found" node="ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.670010 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.669996 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-255.ec2.internal\" not found" node="ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.673050 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.673037 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:25.773999 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.773966 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:25.779314 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.779298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.779402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.779325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.779402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.779343 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e688fec9147a531ae0f3ba981a4ec304-config\") pod \"kube-apiserver-proxy-ip-10-0-142-255.ec2.internal\" (UID: \"e688fec9147a531ae0f3ba981a4ec304\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.874704 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.874668 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:25.880024 
ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.880007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.880143 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.880040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.880143 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.880095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e688fec9147a531ae0f3ba981a4ec304-config\") pod \"kube-apiserver-proxy-ip-10-0-142-255.ec2.internal\" (UID: \"e688fec9147a531ae0f3ba981a4ec304\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.880143 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.880099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.880143 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.880129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e688fec9147a531ae0f3ba981a4ec304-config\") pod \"kube-apiserver-proxy-ip-10-0-142-255.ec2.internal\" (UID: \"e688fec9147a531ae0f3ba981a4ec304\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.880283 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.880170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e121b59458e83a52d173db12d00639e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal\" (UID: \"e121b59458e83a52d173db12d00639e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.967159 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.967123 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.972845 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:25.972827 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 23 08:12:25.974827 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:25.974811 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.075613 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.075534 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.176126 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.176097 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.274762 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.274734 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 08:12:26.275293 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.274899 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 08:12:26.276878 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.276859 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.377146 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.377050 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.377146 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.377088 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 08:12:26.388936 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.388916 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:12:26.397658 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.397630 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:07:25 +0000 UTC" deadline="2027-12-12 15:03:01.777364102 +0000 UTC" Apr 23 08:12:26.397658 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.397657 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14358h50m35.379709807s" Apr 23 08:12:26.407849 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.407830 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:12:26.410614 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.410594 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jwftl" Apr 23 08:12:26.418699 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.418678 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jwftl" Apr 23 08:12:26.477211 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.477189 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 
08:12:26.512711 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:26.512680 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode121b59458e83a52d173db12d00639e1.slice/crio-d71830b266f7160990b3366b8bb328a2e5ddd73fd99ef11630c4032e0f3511bd WatchSource:0}: Error finding container d71830b266f7160990b3366b8bb328a2e5ddd73fd99ef11630c4032e0f3511bd: Status 404 returned error can't find the container with id d71830b266f7160990b3366b8bb328a2e5ddd73fd99ef11630c4032e0f3511bd Apr 23 08:12:26.513231 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:26.513215 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode688fec9147a531ae0f3ba981a4ec304.slice/crio-8b47fbcf16ad78a2850b6024674a94320f5ca89bd688fb33636a969f401b5fd4 WatchSource:0}: Error finding container 8b47fbcf16ad78a2850b6024674a94320f5ca89bd688fb33636a969f401b5fd4: Status 404 returned error can't find the container with id 8b47fbcf16ad78a2850b6024674a94320f5ca89bd688fb33636a969f401b5fd4 Apr 23 08:12:26.516872 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.516858 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:12:26.533908 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.533868 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" event={"ID":"e688fec9147a531ae0f3ba981a4ec304","Type":"ContainerStarted","Data":"8b47fbcf16ad78a2850b6024674a94320f5ca89bd688fb33636a969f401b5fd4"} Apr 23 08:12:26.534882 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.534864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" event={"ID":"e121b59458e83a52d173db12d00639e1","Type":"ContainerStarted","Data":"d71830b266f7160990b3366b8bb328a2e5ddd73fd99ef11630c4032e0f3511bd"} Apr 23 08:12:26.578103 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.578083 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.678562 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.678494 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.778944 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:26.778917 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-255.ec2.internal\" not found" Apr 23 08:12:26.875657 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.875630 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:12:26.877628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.877611 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" Apr 23 08:12:26.899009 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.898989 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:12:26.900504 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.900486 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" Apr 
23 08:12:26.914714 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.914695 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:12:26.932349 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:26.932299 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:12:27.351843 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.351783 2579 apiserver.go:52] "Watching apiserver" Apr 23 08:12:27.359895 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.359870 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:12:27.360940 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.360914 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-7mwd7","kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal","openshift-dns/node-resolver-hnjm9","openshift-image-registry/node-ca-9b2bf","openshift-multus/multus-fj94f","openshift-multus/network-metrics-daemon-d5mzc","openshift-network-diagnostics/network-check-target-c4bd2","openshift-ovn-kubernetes/ovnkube-node-svpkl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc","openshift-cluster-node-tuning-operator/tuned-bbc8p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal","openshift-multus/multus-additional-cni-plugins-h756m","openshift-network-operator/iptables-alerter-ddlvz"] Apr 23 08:12:27.363699 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.363680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:27.363780 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.363753 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:27.368716 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.368696 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.370646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.370629 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.371182 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.371128 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.371272 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.371187 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.371272 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.371190 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7qspq\"" Apr 23 08:12:27.372772 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.372752 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.372954 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.372934 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.373035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.373023 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:27.373120 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.372941 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.373120 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.373109 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:27.373268 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.373250 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vkhph\"" Apr 23 08:12:27.373392 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.373373 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 08:12:27.375174 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.375156 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.376071 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.376040 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:12:27.376265 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.376246 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nl2hr\"" Apr 23 08:12:27.376265 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.376248 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.376418 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.376408 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:12:27.376751 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.376735 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.377663 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.377380 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8gssx\"" Apr 23 08:12:27.377663 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.377403 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:12:27.377663 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.377431 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:12:27.379904 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.379888 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.380003 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.379954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.382328 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.382307 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:12:27.382415 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.382405 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.382736 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.382720 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:12:27.382927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.382913 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.382992 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.382976 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:12:27.383044 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.383002 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:12:27.383044 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.383030 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qz2cb\"" Apr 23 08:12:27.383158 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.382976 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.383347 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.383332 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.383401 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.383344 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-x5zd7\"" Apr 23 08:12:27.383401 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.383348 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.383493 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.383332 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:12:27.384698 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.384680 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.384698 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.384693 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.384834 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.384757 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gmfdz\"" Apr 23 08:12:27.385162 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.385144 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.387099 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.386818 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4jxws\"" Apr 23 08:12:27.387099 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.386906 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:12:27.387240 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.387134 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:12:27.387300 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.387280 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.388186 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-sys-fs\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.388258 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-system-cni-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.388319 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-cni-bin\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.388319 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:27.388422 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-etc-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.388479 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.388479 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-env-overrides\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.388588 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovnkube-script-lib\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.388588 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt5nw\" (UniqueName: \"kubernetes.io/projected/10f6f586-998e-4725-bb51-e801aba526fe-kube-api-access-dt5nw\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.388588 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-systemd-units\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.388588 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388565 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-run-ovn-kubernetes\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.388775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388635 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10f6f586-998e-4725-bb51-e801aba526fe-serviceca\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.388775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-conf-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " 
pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.389015 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.388992 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf4bc19e-bb95-4e2e-9978-4a53f064696d-tmp-dir\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.389093 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.389040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-kubelet\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.389154 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.389094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovn-node-metrics-cert\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.389154 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.389128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-registration-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.389599 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.389581 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:12:27.390116 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-cnibin\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390201 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-hostroot\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390201 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-etc-kubernetes\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390315 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-node-log\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 
23 08:12:27.390368 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-cni-bin\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.390368 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390347 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-socket-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.390464 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-cni-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390464 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-k8s-cni-cncf-io\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390564 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/015aecae-ba6f-4d84-946d-58733117d34f-multus-daemon-config\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390564 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/65461c01-4562-4d4b-86d7-6491c2bd2b8c-konnectivity-ca\") pod \"konnectivity-agent-7mwd7\" (UID: \"65461c01-4562-4d4b-86d7-6491c2bd2b8c\") " pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.390564 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-systemd\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.390710 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390577 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jmfxl\"" Apr 23 08:12:27.390710 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovnkube-config\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.390710 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390615 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-etc-selinux\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.390710 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-socket-dir-parent\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.390710 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqxr\" (UniqueName: \"kubernetes.io/projected/cf4bc19e-bb95-4e2e-9978-4a53f064696d-kube-api-access-vxqxr\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.390710 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390703 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt82s\" (UniqueName: \"kubernetes.io/projected/3f1e47b2-e320-437a-b287-7a3cb3b8613f-kube-api-access-kt82s\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390806 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-netns\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/015aecae-ba6f-4d84-946d-58733117d34f-cni-binary-copy\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: 
I0423 08:12:27.390932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-device-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.390978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-cni-multus\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-multus-certs\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-cni-netd\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-run-netns\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391231 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-var-lib-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391259 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/65461c01-4562-4d4b-86d7-6491c2bd2b8c-agent-certs\") pod \"konnectivity-agent-7mwd7\" (UID: \"65461c01-4562-4d4b-86d7-6491c2bd2b8c\") " pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf4bc19e-bb95-4e2e-9978-4a53f064696d-hosts-file\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7mx\" (UniqueName: \"kubernetes.io/projected/41fa5042-9289-494c-9973-953c5146e01c-kube-api-access-vt7mx\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:27.391420 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10f6f586-998e-4725-bb51-e801aba526fe-host\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-ovn\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-log-socket\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391527 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6lzl\" (UniqueName: \"kubernetes.io/projected/ec9efb36-5e54-40e4-9ff4-f25ef8172507-kube-api-access-l6lzl\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-os-release\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391607 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-kubelet\") pod \"multus-fj94f\" (UID: 
\"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrlb\" (UniqueName: \"kubernetes.io/projected/015aecae-ba6f-4d84-946d-58733117d34f-kube-api-access-rwrlb\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.391842 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.391671 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-slash\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.419974 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.419949 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:07:26 +0000 UTC" deadline="2027-12-18 03:04:47.168738856 +0000 UTC" Apr 23 08:12:27.419974 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.419973 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14490h52m19.748768368s" Apr 23 08:12:27.478869 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.478843 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 08:12:27.492181 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-device-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.492285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-cni-multus\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.492285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-multus-certs\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.492285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-cni-netd\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-cni-multus\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-device-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-multus-certs\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492345 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-cni-netd\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-kubernetes\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-run-netns\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-var-lib-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-modprobe-d\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/65461c01-4562-4d4b-86d7-6491c2bd2b8c-agent-certs\") pod \"konnectivity-agent-7mwd7\" (UID: \"65461c01-4562-4d4b-86d7-6491c2bd2b8c\") " pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf4bc19e-bb95-4e2e-9978-4a53f064696d-hosts-file\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-run-netns\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt7mx\" (UniqueName: \"kubernetes.io/projected/41fa5042-9289-494c-9973-953c5146e01c-kube-api-access-vt7mx\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-lib-modules\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-var-lib-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10f6f586-998e-4725-bb51-e801aba526fe-host\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf4bc19e-bb95-4e2e-9978-4a53f064696d-hosts-file\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10f6f586-998e-4725-bb51-e801aba526fe-host\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysctl-d\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.492753 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-run\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-ovn\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-log-socket\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-ovn\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6lzl\" (UniqueName: 
\"kubernetes.io/projected/ec9efb36-5e54-40e4-9ff4-f25ef8172507-kube-api-access-l6lzl\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492876 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-os-release\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492889 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-log-socket\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-kubelet\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-os-release\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrlb\" (UniqueName: \"kubernetes.io/projected/015aecae-ba6f-4d84-946d-58733117d34f-kube-api-access-rwrlb\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.492983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-kubelet\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-host\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493024 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-slash\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-sys-fs\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-system-cni-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493103 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-slash\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-cni-bin\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493148 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-var-lib-cni-bin\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-sys-fs\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-system-cni-dir\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 
08:12:27.493166 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493180 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-system-cni-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-cni-binary-copy\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-etc-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.493221 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493254 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-etc-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-env-overrides\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-openvswitch\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.493308 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" 
failed. No retries permitted until 2026-04-23 08:12:27.993275604 +0000 UTC m=+3.120279407 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovnkube-script-lib\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493351 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dt5nw\" (UniqueName: \"kubernetes.io/projected/10f6f586-998e-4725-bb51-e801aba526fe-kube-api-access-dt5nw\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-systemd-units\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.493856 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-run-ovn-kubernetes\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10f6f586-998e-4725-bb51-e801aba526fe-serviceca\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493511 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-systemd-units\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-conf-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf4bc19e-bb95-4e2e-9978-4a53f064696d-tmp-dir\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " 
pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-run-ovn-kubernetes\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493646 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-conf-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-systemd\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493766 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a63e511b-0189-48fb-bcb8-1878f4bec538-iptables-alerter-script\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-env-overrides\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-kubelet\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493830 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-kubelet\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovn-node-metrics-cert\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493875 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-registration-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-cnibin\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-hostroot\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-etc-kubernetes\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.494510 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10f6f586-998e-4725-bb51-e801aba526fe-serviceca\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493965 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovnkube-script-lib\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kdp\" (UniqueName: \"kubernetes.io/projected/a63e511b-0189-48fb-bcb8-1878f4bec538-kube-api-access-n4kdp\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493965 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf4bc19e-bb95-4e2e-9978-4a53f064696d-tmp-dir\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.493998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-node-log\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494004 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-etc-kubernetes\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-hostroot\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-cni-bin\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-registration-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-node-log\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-cnibin\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494098 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-socket-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-host-cni-bin\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-cni-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494200 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-socket-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-k8s-cni-cncf-io\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-cni-dir\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-k8s-cni-cncf-io\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.495532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/015aecae-ba6f-4d84-946d-58733117d34f-multus-daemon-config\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/65461c01-4562-4d4b-86d7-6491c2bd2b8c-konnectivity-ca\") pod \"konnectivity-agent-7mwd7\" (UID: \"65461c01-4562-4d4b-86d7-6491c2bd2b8c\") " pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-systemd\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovnkube-config\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-etc-selinux\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec9efb36-5e54-40e4-9ff4-f25ef8172507-run-systemd\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-socket-dir-parent\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxqxr\" (UniqueName: \"kubernetes.io/projected/cf4bc19e-bb95-4e2e-9978-4a53f064696d-kube-api-access-vxqxr\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-os-release\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be039d04-f58e-488c-9676-348a83fcc83e-etc-tuned\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt82s\" (UniqueName: \"kubernetes.io/projected/3f1e47b2-e320-437a-b287-7a3cb3b8613f-kube-api-access-kt82s\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-sys\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-multus-socket-dir-parent\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494661 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be039d04-f58e-488c-9676-348a83fcc83e-tmp\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2872\" (UniqueName: \"kubernetes.io/projected/be039d04-f58e-488c-9676-348a83fcc83e-kube-api-access-g2872\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a63e511b-0189-48fb-bcb8-1878f4bec538-host-slash\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.496351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-netns\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-cnibin\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skklt\" (UniqueName: \"kubernetes.io/projected/325bc232-cf8c-46f7-a278-679124fa4e09-kube-api-access-skklt\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/65461c01-4562-4d4b-86d7-6491c2bd2b8c-konnectivity-ca\") pod \"konnectivity-agent-7mwd7\" (UID: \"65461c01-4562-4d4b-86d7-6491c2bd2b8c\") " pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.494972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysctl-conf\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/015aecae-ba6f-4d84-946d-58733117d34f-host-run-netns\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-var-lib-kubelet\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495080 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-etc-selinux\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/015aecae-ba6f-4d84-946d-58733117d34f-cni-binary-copy\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysconfig\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495153 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f1e47b2-e320-437a-b287-7a3cb3b8613f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/015aecae-ba6f-4d84-946d-58733117d34f-cni-binary-copy\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.495971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovnkube-config\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.497166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.496399 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/015aecae-ba6f-4d84-946d-58733117d34f-multus-daemon-config\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.497603 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.497301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec9efb36-5e54-40e4-9ff4-f25ef8172507-ovn-node-metrics-cert\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.497603 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.497410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/65461c01-4562-4d4b-86d7-6491c2bd2b8c-agent-certs\") pod \"konnectivity-agent-7mwd7\" (UID: \"65461c01-4562-4d4b-86d7-6491c2bd2b8c\") " pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.510217 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.510130 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:27.510217 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.510180 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:27.510217 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.510195 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:27.510490 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.510190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrlb\" (UniqueName: \"kubernetes.io/projected/015aecae-ba6f-4d84-946d-58733117d34f-kube-api-access-rwrlb\") pod \"multus-fj94f\" (UID: \"015aecae-ba6f-4d84-946d-58733117d34f\") " pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.510490 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:27.510381 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:28.010362056 +0000 UTC m=+3.137365866 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:27.511241 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.511215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6lzl\" (UniqueName: \"kubernetes.io/projected/ec9efb36-5e54-40e4-9ff4-f25ef8172507-kube-api-access-l6lzl\") pod \"ovnkube-node-svpkl\" (UID: \"ec9efb36-5e54-40e4-9ff4-f25ef8172507\") " pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.512701 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.512677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt5nw\" (UniqueName: \"kubernetes.io/projected/10f6f586-998e-4725-bb51-e801aba526fe-kube-api-access-dt5nw\") pod \"node-ca-9b2bf\" (UID: \"10f6f586-998e-4725-bb51-e801aba526fe\") " pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.513698 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.513662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt82s\" (UniqueName: \"kubernetes.io/projected/3f1e47b2-e320-437a-b287-7a3cb3b8613f-kube-api-access-kt82s\") pod \"aws-ebs-csi-driver-node-zltrc\" (UID: \"3f1e47b2-e320-437a-b287-7a3cb3b8613f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.515010 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.514987 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqxr\" (UniqueName: \"kubernetes.io/projected/cf4bc19e-bb95-4e2e-9978-4a53f064696d-kube-api-access-vxqxr\") pod \"node-resolver-hnjm9\" (UID: \"cf4bc19e-bb95-4e2e-9978-4a53f064696d\") " pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.515211 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.515188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt7mx\" (UniqueName: \"kubernetes.io/projected/41fa5042-9289-494c-9973-953c5146e01c-kube-api-access-vt7mx\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:27.595907 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.595878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-modprobe-d\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.595915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-lib-modules\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.595931 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysctl-d\") pod 
\"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.595949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-run\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.595970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.596072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.595992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-host\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-system-cni-dir\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-cni-binary-copy\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-run\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-lib-modules\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysctl-d\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-modprobe-d\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596139 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-host\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596140 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-systemd\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-system-cni-dir\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-systemd\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a63e511b-0189-48fb-bcb8-1878f4bec538-iptables-alerter-script\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kdp\" (UniqueName: \"kubernetes.io/projected/a63e511b-0189-48fb-bcb8-1878f4bec538-kube-api-access-n4kdp\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-os-release\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.596367 ip-10-0-142-255 
Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be039d04-f58e-488c-9676-348a83fcc83e-etc-tuned\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p"
Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-sys\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p"
Apr 23 08:12:27.596367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be039d04-f58e-488c-9676-348a83fcc83e-tmp\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p"
Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-os-release\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m"
Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2872\" (UniqueName: \"kubernetes.io/projected/be039d04-f58e-488c-9676-348a83fcc83e-kube-api-access-g2872\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p"
Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a63e511b-0189-48fb-bcb8-1878f4bec538-host-slash\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz"
Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-cnibin\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m"
Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596605 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-cni-binary-copy\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m"
pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skklt\" (UniqueName: \"kubernetes.io/projected/325bc232-cf8c-46f7-a278-679124fa4e09-kube-api-access-skklt\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysctl-conf\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a63e511b-0189-48fb-bcb8-1878f4bec538-host-slash\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/325bc232-cf8c-46f7-a278-679124fa4e09-cnibin\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-var-lib-kubelet\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-var-lib-kubelet\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a63e511b-0189-48fb-bcb8-1878f4bec538-iptables-alerter-script\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysconfig\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596743 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-sys\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596764 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.597178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-kubernetes\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597780 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysctl-conf\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597780 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-sysconfig\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597780 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.596891 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be039d04-f58e-488c-9676-348a83fcc83e-etc-kubernetes\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.597780 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.597229 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/325bc232-cf8c-46f7-a278-679124fa4e09-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.598646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.598616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be039d04-f58e-488c-9676-348a83fcc83e-etc-tuned\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.599078 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.599044 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be039d04-f58e-488c-9676-348a83fcc83e-tmp\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.612267 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.612219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kdp\" (UniqueName: \"kubernetes.io/projected/a63e511b-0189-48fb-bcb8-1878f4bec538-kube-api-access-n4kdp\") pod \"iptables-alerter-ddlvz\" (UID: \"a63e511b-0189-48fb-bcb8-1878f4bec538\") " pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:27.612448 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.612428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skklt\" (UniqueName: \"kubernetes.io/projected/325bc232-cf8c-46f7-a278-679124fa4e09-kube-api-access-skklt\") pod \"multus-additional-cni-plugins-h756m\" (UID: \"325bc232-cf8c-46f7-a278-679124fa4e09\") " pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.612509 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.612432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2872\" (UniqueName: \"kubernetes.io/projected/be039d04-f58e-488c-9676-348a83fcc83e-kube-api-access-g2872\") pod \"tuned-bbc8p\" (UID: \"be039d04-f58e-488c-9676-348a83fcc83e\") " pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.678389 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.678333 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hnjm9" Apr 23 08:12:27.680604 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.680584 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:12:27.684779 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.684762 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9b2bf" Apr 23 08:12:27.694273 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.694250 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fj94f" Apr 23 08:12:27.700068 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.700022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:27.705694 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.705676 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:27.712296 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.712278 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" Apr 23 08:12:27.718818 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.718796 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" Apr 23 08:12:27.724370 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.724352 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h756m" Apr 23 08:12:27.729845 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:27.729827 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-ddlvz" Apr 23 08:12:28.000705 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.000632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:28.000826 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.000751 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:28.000826 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.000810 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:29.000789614 +0000 UTC m=+4.127793418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:28.101891 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.101857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:28.102033 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.102006 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:28.102033 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.102024 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:28.102033 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.102033 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:28.102170 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.102105 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:29.102091083 +0000 UTC m=+4.229094886 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:28.225650 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.225608 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9efb36_5e54_40e4_9ff4_f25ef8172507.slice/crio-2edda7bddbe1b46a97793a4dcb8a41697f6b799f36aea0fb92b4d9b2592e62b4 WatchSource:0}: Error finding container 2edda7bddbe1b46a97793a4dcb8a41697f6b799f36aea0fb92b4d9b2592e62b4: Status 404 returned error can't find the container with id 2edda7bddbe1b46a97793a4dcb8a41697f6b799f36aea0fb92b4d9b2592e62b4 Apr 23 08:12:28.226325 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.226301 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf4bc19e_bb95_4e2e_9978_4a53f064696d.slice/crio-5c7915a4d8bec2b63f7951a3acfc3faa52312a35953c32255f21dbc085a3ad75 WatchSource:0}: Error finding container 5c7915a4d8bec2b63f7951a3acfc3faa52312a35953c32255f21dbc085a3ad75: Status 404 returned error can't find the container with id 5c7915a4d8bec2b63f7951a3acfc3faa52312a35953c32255f21dbc085a3ad75 Apr 23 08:12:28.227309 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.227286 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f1e47b2_e320_437a_b287_7a3cb3b8613f.slice/crio-b27b8375990fece05a0330f165213ae6dfbb8a0d37e5b80368da24c19742ee36 WatchSource:0}: Error finding container b27b8375990fece05a0330f165213ae6dfbb8a0d37e5b80368da24c19742ee36: Status 404 returned error can't find the container with id b27b8375990fece05a0330f165213ae6dfbb8a0d37e5b80368da24c19742ee36 Apr 23 08:12:28.227956 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.227875 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe039d04_f58e_488c_9676_348a83fcc83e.slice/crio-0085a76564da2607b0dd6e6af73df35c85c918421ae9df1b6e1b41bd867c19f4 WatchSource:0}: Error finding container 0085a76564da2607b0dd6e6af73df35c85c918421ae9df1b6e1b41bd867c19f4: Status 404 returned error can't find the container with id 0085a76564da2607b0dd6e6af73df35c85c918421ae9df1b6e1b41bd867c19f4 Apr 23 08:12:28.228855 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.228782 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63e511b_0189_48fb_bcb8_1878f4bec538.slice/crio-7edcd23e01078b5e07eeb123b2b13a309883be5c17a30174069b57ef8a2f2983 WatchSource:0}: Error finding container 7edcd23e01078b5e07eeb123b2b13a309883be5c17a30174069b57ef8a2f2983: Status 404 returned error can't find the container with id 7edcd23e01078b5e07eeb123b2b13a309883be5c17a30174069b57ef8a2f2983 Apr 23 08:12:28.229780 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.229755 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod325bc232_cf8c_46f7_a278_679124fa4e09.slice/crio-e78f22e7ee62151d7e74991992b69dc642f4bd70404373022eeaad80975153d2 WatchSource:0}: Error finding 
Apr 23 08:12:28.231611 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.231595 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f6f586_998e_4725_bb51_e801aba526fe.slice/crio-5d0667ef7e2d8452398dc2e6cbad2eecef9ce2fce0240cc2b97027bb30e79244 WatchSource:0}: Error finding container 5d0667ef7e2d8452398dc2e6cbad2eecef9ce2fce0240cc2b97027bb30e79244: Status 404 returned error can't find the container with id 5d0667ef7e2d8452398dc2e6cbad2eecef9ce2fce0240cc2b97027bb30e79244
Apr 23 08:12:28.252050 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.252014 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65461c01_4562_4d4b_86d7_6491c2bd2b8c.slice/crio-916fc5d6ea39b33fb82205558014fb981afe06c77fda4f692a223336ceb58a56 WatchSource:0}: Error finding container 916fc5d6ea39b33fb82205558014fb981afe06c77fda4f692a223336ceb58a56: Status 404 returned error can't find the container with id 916fc5d6ea39b33fb82205558014fb981afe06c77fda4f692a223336ceb58a56
Apr 23 08:12:28.252790 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:12:28.252771 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015aecae_ba6f_4d84_946d_58733117d34f.slice/crio-ee3371b4f982a8334da59d5604092bedc3610a2c1f86a075ffb38f402fd412e2 WatchSource:0}: Error finding container ee3371b4f982a8334da59d5604092bedc3610a2c1f86a075ffb38f402fd412e2: Status 404 returned error can't find the container with id ee3371b4f982a8334da59d5604092bedc3610a2c1f86a075ffb38f402fd412e2
Apr 23 08:12:28.421205 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.421041 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:07:26 +0000 UTC" deadline="2027-12-13 17:14:27.412616369 +0000 UTC"
Apr 23 08:12:28.421205 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.421202 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14385h1m58.991417572s"
Apr 23 08:12:28.531641 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.531573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc"
Apr 23 08:12:28.531756 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:28.531678 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c"
pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:28.538485 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.538451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fj94f" event={"ID":"015aecae-ba6f-4d84-946d-58733117d34f","Type":"ContainerStarted","Data":"ee3371b4f982a8334da59d5604092bedc3610a2c1f86a075ffb38f402fd412e2"} Apr 23 08:12:28.539375 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.539350 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7mwd7" event={"ID":"65461c01-4562-4d4b-86d7-6491c2bd2b8c","Type":"ContainerStarted","Data":"916fc5d6ea39b33fb82205558014fb981afe06c77fda4f692a223336ceb58a56"} Apr 23 08:12:28.540431 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.540413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerStarted","Data":"e78f22e7ee62151d7e74991992b69dc642f4bd70404373022eeaad80975153d2"} Apr 23 08:12:28.542819 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.542797 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" event={"ID":"be039d04-f58e-488c-9676-348a83fcc83e","Type":"ContainerStarted","Data":"0085a76564da2607b0dd6e6af73df35c85c918421ae9df1b6e1b41bd867c19f4"} Apr 23 08:12:28.543604 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.543585 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" event={"ID":"3f1e47b2-e320-437a-b287-7a3cb3b8613f","Type":"ContainerStarted","Data":"b27b8375990fece05a0330f165213ae6dfbb8a0d37e5b80368da24c19742ee36"} Apr 23 08:12:28.545285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.545267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hnjm9" event={"ID":"cf4bc19e-bb95-4e2e-9978-4a53f064696d","Type":"ContainerStarted","Data":"5c7915a4d8bec2b63f7951a3acfc3faa52312a35953c32255f21dbc085a3ad75"} Apr 23 08:12:28.547074 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.547039 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9b2bf" event={"ID":"10f6f586-998e-4725-bb51-e801aba526fe","Type":"ContainerStarted","Data":"5d0667ef7e2d8452398dc2e6cbad2eecef9ce2fce0240cc2b97027bb30e79244"} Apr 23 08:12:28.548096 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.548049 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ddlvz" event={"ID":"a63e511b-0189-48fb-bcb8-1878f4bec538","Type":"ContainerStarted","Data":"7edcd23e01078b5e07eeb123b2b13a309883be5c17a30174069b57ef8a2f2983"} Apr 23 08:12:28.550140 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.550092 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"2edda7bddbe1b46a97793a4dcb8a41697f6b799f36aea0fb92b4d9b2592e62b4"} Apr 23 08:12:28.551679 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:28.551659 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" event={"ID":"e688fec9147a531ae0f3ba981a4ec304","Type":"ContainerStarted","Data":"7358270e52792c851f41c1281455d2b75e97c77f85cb8c03294bc8f8110ecbb3"} Apr 23 08:12:29.007814 ip-10-0-142-255 
Apr 23 08:12:29.007957 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.007892 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:29.008026 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.007970 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:31.007951162 +0000 UTC m=+6.134954971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:29.108353 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:29.108296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2"
Apr 23 08:12:29.108509 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.108473 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:29.108509 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.108493 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:29.108509 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.108505 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:29.108610 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.108564 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:31.108544453 +0000 UTC m=+6.235548266 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:29.537777 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:29.537746 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:29.538307 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:29.537863 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:29.565840 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:29.565769 2579 generic.go:358] "Generic (PLEG): container finished" podID="e121b59458e83a52d173db12d00639e1" containerID="ebec9803ee81d7f5136d64089ef127065d8d082dd3a173454a270b5797ebb304" exitCode=0 Apr 23 08:12:29.566747 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:29.566725 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" event={"ID":"e121b59458e83a52d173db12d00639e1","Type":"ContainerDied","Data":"ebec9803ee81d7f5136d64089ef127065d8d082dd3a173454a270b5797ebb304"} Apr 23 08:12:29.584545 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:29.584498 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-255.ec2.internal" podStartSLOduration=3.584483185 podStartE2EDuration="3.584483185s" podCreationTimestamp="2026-04-23 08:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:12:28.569204296 +0000 UTC m=+3.696208125" watchObservedRunningTime="2026-04-23 08:12:29.584483185 +0000 UTC m=+4.711487017" Apr 23 08:12:30.531721 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:30.531656 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:30.531882 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:30.531789 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:30.592016 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:30.591981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" event={"ID":"e121b59458e83a52d173db12d00639e1","Type":"ContainerStarted","Data":"344b8290808bfecc631c2d47c1d24f0fed0c55cb59ea56a5136bf1263ed3bc72"} Apr 23 08:12:30.607823 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:30.607768 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-255.ec2.internal" podStartSLOduration=4.607749896 podStartE2EDuration="4.607749896s" podCreationTimestamp="2026-04-23 08:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:12:30.607320093 +0000 UTC m=+5.734323923" watchObservedRunningTime="2026-04-23 08:12:30.607749896 +0000 UTC m=+5.734753723" Apr 23 08:12:31.022687 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:31.022609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:31.022845 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.022809 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:31.022901 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.022874 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:35.022855804 +0000 UTC m=+10.149859616 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:31.123341 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:31.123303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:31.123508 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.123482 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:31.123508 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.123502 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:31.123610 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.123516 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:31.123610 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.123581 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:35.12356227 +0000 UTC m=+10.250566082 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:31.533508 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:31.533023 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:31.533508 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:31.533164 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:32.531151 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:32.531114 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:32.531594 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:32.531264 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:33.262560 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.262521 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x8j9k"] Apr 23 08:12:33.268052 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.267696 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.268052 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:33.267776 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:33.342209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.342035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.342209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.342102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/98eee7d8-bae0-438a-887a-8591437a310c-dbus\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.342209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.342132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/98eee7d8-bae0-438a-887a-8591437a310c-kubelet-config\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.443038 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.442997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/98eee7d8-bae0-438a-887a-8591437a310c-dbus\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.443208 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.443070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/98eee7d8-bae0-438a-887a-8591437a310c-kubelet-config\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.443208 
Apr 23 08:12:33.443292 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:33.443242 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:33.443324 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:33.443292 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret podName:98eee7d8-bae0-438a-887a-8591437a310c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:33.943279577 +0000 UTC m=+9.070283381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret") pod "global-pull-secret-syncer-x8j9k" (UID: "98eee7d8-bae0-438a-887a-8591437a310c") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:33.443690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.443661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/98eee7d8-bae0-438a-887a-8591437a310c-dbus\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k"
Apr 23 08:12:33.443797 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.443660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/98eee7d8-bae0-438a-887a-8591437a310c-kubelet-config\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k"
Apr 23 08:12:33.533614 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.533272 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2"
Apr 23 08:12:33.533614 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:33.533387 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9"
pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:33.947688 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:33.947132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:33.947688 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:33.947271 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:33.947688 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:33.947352 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret podName:98eee7d8-bae0-438a-887a-8591437a310c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:34.947331985 +0000 UTC m=+10.074335793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret") pod "global-pull-secret-syncer-x8j9k" (UID: "98eee7d8-bae0-438a-887a-8591437a310c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:34.531483 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:34.530953 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:34.531483 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:34.531108 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:34.954762 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:34.954665 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:34.955161 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:34.954860 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:34.955161 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:34.954927 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret podName:98eee7d8-bae0-438a-887a-8591437a310c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:36.954906632 +0000 UTC m=+12.081910441 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret") pod "global-pull-secret-syncer-x8j9k" (UID: "98eee7d8-bae0-438a-887a-8591437a310c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:35.055117 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:35.055081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:35.055279 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.055193 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:35.055279 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.055238 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:43.055225409 +0000 UTC m=+18.182229212 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:35.155781 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:35.155741 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:35.155960 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.155932 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:35.155960 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.155955 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:35.156114 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.155969 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:35.156114 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.156024 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:43.156006972 +0000 UTC m=+18.283010782 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:35.532541 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:35.532516 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:35.532685 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.532592 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:35.532740 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:35.532682 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:35.532807 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:35.532786 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:36.531745 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:36.531705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:36.532224 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:36.531842 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:36.969737 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:36.969645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:36.970160 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:36.970131 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:36.970262 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:36.970219 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret podName:98eee7d8-bae0-438a-887a-8591437a310c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:40.970198911 +0000 UTC m=+16.097202733 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret") pod "global-pull-secret-syncer-x8j9k" (UID: "98eee7d8-bae0-438a-887a-8591437a310c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:37.534198 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:37.533765 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:37.534198 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:37.533879 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:37.534198 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:37.533881 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:37.534198 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:37.533948 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:38.531187 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:38.531140 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:38.531364 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:38.531277 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:39.534402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:39.534362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:39.534871 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:39.534363 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:39.534871 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:39.534465 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:39.534871 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:39.534551 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:40.531599 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:40.531562 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:40.531770 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:40.531677 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:40.996323 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:40.996282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:40.996758 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:40.996413 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:40.996758 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:40.996474 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret podName:98eee7d8-bae0-438a-887a-8591437a310c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:48.996455335 +0000 UTC m=+24.123459139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret") pod "global-pull-secret-syncer-x8j9k" (UID: "98eee7d8-bae0-438a-887a-8591437a310c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:41.531217 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:41.531185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:41.531399 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:41.531359 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:41.531459 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:41.531414 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:41.531546 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:41.531529 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:42.531833 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:42.531790 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:42.532316 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:42.531918 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:43.111118 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:43.111077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:43.111296 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.111166 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:43.111296 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.111217 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. No retries permitted until 2026-04-23 08:12:59.111202214 +0000 UTC m=+34.238206018 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:43.212352 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:43.212313 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:43.212519 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.212440 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:43.212519 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.212454 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:43.212519 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.212464 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:43.212519 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.212521 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:59.212501035 +0000 UTC m=+34.339504860 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:43.534132 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:43.534048 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:43.534532 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:43.534070 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:43.534532 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.534196 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:43.534532 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:43.534238 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:44.531644 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:44.531613 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:44.531813 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:44.531737 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:45.535894 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.534848 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:45.535894 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:45.535076 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:45.535894 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.535375 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:45.536568 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:45.536544 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:45.616267 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.616030 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hnjm9" event={"ID":"cf4bc19e-bb95-4e2e-9978-4a53f064696d","Type":"ContainerStarted","Data":"42c97eb6710bc81ff8b64cba850a90e114d7956c8b04893c89c61ee18ad0eaee"} Apr 23 08:12:45.620795 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.620395 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7mwd7" event={"ID":"65461c01-4562-4d4b-86d7-6491c2bd2b8c","Type":"ContainerStarted","Data":"ed1b76e7be98614caa3b70b411d107e5ccd965fcb390154f99b7ff13ff1b3407"} Apr 23 08:12:45.630293 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.630257 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hnjm9" podStartSLOduration=3.458141834 podStartE2EDuration="20.630245077s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.228206273 +0000 UTC m=+3.355210077" lastFinishedPulling="2026-04-23 08:12:45.400309513 +0000 UTC m=+20.527313320" observedRunningTime="2026-04-23 08:12:45.629831316 +0000 UTC m=+20.756835143" watchObservedRunningTime="2026-04-23 08:12:45.630245077 +0000 UTC m=+20.757248903" Apr 23 08:12:45.692512 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.692484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:45.692991 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.692966 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:45.708213 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:45.708109 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7mwd7" podStartSLOduration=3.580558505 podStartE2EDuration="20.708088351s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.25468337 +0000 UTC m=+3.381687173" lastFinishedPulling="2026-04-23 08:12:45.382213214 +0000 UTC m=+20.509217019" observedRunningTime="2026-04-23 08:12:45.645251595 +0000 UTC m=+20.772255420" watchObservedRunningTime="2026-04-23 08:12:45.708088351 +0000 UTC m=+20.835092181" Apr 23 08:12:46.531555 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.531528 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:46.531714 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:46.531642 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:46.630265 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630236 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log" Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630577 2579 generic.go:358] "Generic (PLEG): container finished" podID="ec9efb36-5e54-40e4-9ff4-f25ef8172507" containerID="374e7764e037496f01ecfc75460213ae3962df60ca43f90981e86189e022e526" exitCode=1 Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630608 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"b46b79cb0346c9cac66f0319699bedd0165eabe7fcc6f0fd0ea47997e7dfef6c"} Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630642 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"e26b46296381fd74ecf397304b5305c451e0637a7f5c8d3b13f4c41daaa23cf1"} Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630656 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"28c4ea5a334431ea8c481729c053e3edc360c3e52f5137d11dfcc5631b369e98"} Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630669 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"a67a017ef80ad70b108bbb3c3449eecf3df7830a78f0b8cd5d869517d0f158a8"} Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630681 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerDied","Data":"374e7764e037496f01ecfc75460213ae3962df60ca43f90981e86189e022e526"} Apr 23 08:12:46.630980 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.630694 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"cf3f506475f91336ec4c5d126cf58b42ca01cf8b74da642c193806585a40752c"} Apr 23 08:12:46.632224 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.632201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fj94f" event={"ID":"015aecae-ba6f-4d84-946d-58733117d34f","Type":"ContainerStarted","Data":"cfde7e3f17d5077de8b86ed0af325044f70a0a2ed2324d13334488244adb6073"} Apr 23 08:12:46.633767 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.633745 2579 generic.go:358] "Generic (PLEG): container finished" podID="325bc232-cf8c-46f7-a278-679124fa4e09" containerID="9bb16c83a804d6bd8d29fe4a9fe8a7e7abe6c123d95dd80dd878eaee20dc02ad" exitCode=0 Apr 23 08:12:46.633859 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.633818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" 
event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerDied","Data":"9bb16c83a804d6bd8d29fe4a9fe8a7e7abe6c123d95dd80dd878eaee20dc02ad"} Apr 23 08:12:46.637929 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.637903 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" event={"ID":"be039d04-f58e-488c-9676-348a83fcc83e","Type":"ContainerStarted","Data":"53272a9d8f7ac89a22bfc8f265a3a1accce4e4db6a2b55459f66ed640e92f698"} Apr 23 08:12:46.639415 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.639389 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" event={"ID":"3f1e47b2-e320-437a-b287-7a3cb3b8613f","Type":"ContainerStarted","Data":"084a60e2fe00eee0246331454237364fe11a60a5ee43bfbfb2a9a61b56c6b2d9"} Apr 23 08:12:46.640640 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.640605 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9b2bf" event={"ID":"10f6f586-998e-4725-bb51-e801aba526fe","Type":"ContainerStarted","Data":"f85cc8f156d149d7c48ad2ce5b6a9352964959b7610a9fee507feaee870df479"} Apr 23 08:12:46.641039 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.641015 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:46.641422 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.641399 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7mwd7" Apr 23 08:12:46.651022 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.650970 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fj94f" podStartSLOduration=4.486977997 podStartE2EDuration="21.650956413s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.254692917 +0000 UTC m=+3.381696720" lastFinishedPulling="2026-04-23 08:12:45.41867133 +0000 UTC m=+20.545675136" observedRunningTime="2026-04-23 08:12:46.650788297 +0000 UTC m=+21.777792122" watchObservedRunningTime="2026-04-23 08:12:46.650956413 +0000 UTC m=+21.777960260" Apr 23 08:12:46.702414 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.702369 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bbc8p" podStartSLOduration=4.550554158 podStartE2EDuration="21.70235384s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.230340598 +0000 UTC m=+3.357344401" lastFinishedPulling="2026-04-23 08:12:45.382140278 +0000 UTC m=+20.509144083" observedRunningTime="2026-04-23 08:12:46.701879059 +0000 UTC m=+21.828882886" watchObservedRunningTime="2026-04-23 08:12:46.70235384 +0000 UTC m=+21.829357665" Apr 23 08:12:46.715505 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:46.715465 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9b2bf" podStartSLOduration=4.566272078 podStartE2EDuration="21.715453889s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.251126561 +0000 UTC m=+3.378130369" lastFinishedPulling="2026-04-23 08:12:45.400308375 +0000 UTC m=+20.527312180" observedRunningTime="2026-04-23 08:12:46.715322428 +0000 UTC m=+21.842326253" watchObservedRunningTime="2026-04-23 08:12:46.715453889 +0000 UTC m=+21.842457709" Apr 23 08:12:47.078675 ip-10-0-142-255 
kubenswrapper[2579]: I0423 08:12:47.078647 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:12:47.460325 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.460160 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:12:47.078670216Z","UUID":"eadb6c37-211b-4bf3-8915-fad2511d951f","Handler":null,"Name":"","Endpoint":""} Apr 23 08:12:47.462531 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.462508 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:12:47.462677 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.462544 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:12:47.532893 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.532841 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:47.533109 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:47.532941 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:47.533340 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.533324 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:47.533415 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:47.533390 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:47.645755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.645719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" event={"ID":"3f1e47b2-e320-437a-b287-7a3cb3b8613f","Type":"ContainerStarted","Data":"bcb12b4f3c1f77a093f90482cc7af4bd9261a08b2ac9a40c88756f25d51b142e"} Apr 23 08:12:47.648137 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.648108 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ddlvz" event={"ID":"a63e511b-0189-48fb-bcb8-1878f4bec538","Type":"ContainerStarted","Data":"0cdeccc62bf8d36196d926d90d425bf6065f7daca8f1df2adced48e2ee4945cb"} Apr 23 08:12:47.663375 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:47.663334 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ddlvz" podStartSLOduration=5.487218271 podStartE2EDuration="22.663317304s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.230939415 +0000 UTC m=+3.357943227" lastFinishedPulling="2026-04-23 08:12:45.407038454 +0000 UTC m=+20.534042260" observedRunningTime="2026-04-23 08:12:47.662879415 +0000 UTC m=+22.789883241" watchObservedRunningTime="2026-04-23 08:12:47.663317304 +0000 UTC m=+22.790321130" Apr 23 08:12:48.531127 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:48.531034 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:48.531268 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:48.531156 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:48.652039 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:48.651994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" event={"ID":"3f1e47b2-e320-437a-b287-7a3cb3b8613f","Type":"ContainerStarted","Data":"9d7575ae494d71b0f9157525fbbfa12f5230289f229b6c195562aa8da0f90e68"} Apr 23 08:12:49.059276 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:49.059241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:49.059460 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:49.059401 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:49.059513 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:49.059468 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret podName:98eee7d8-bae0-438a-887a-8591437a310c nodeName:}" failed. No retries permitted until 2026-04-23 08:13:05.059454218 +0000 UTC m=+40.186458022 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret") pod "global-pull-secret-syncer-x8j9k" (UID: "98eee7d8-bae0-438a-887a-8591437a310c") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:49.531864 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:49.531667 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:49.532037 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:49.531667 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:49.532037 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:49.531954 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:49.532037 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:49.532025 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:49.657230 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:49.657203 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log" Apr 23 08:12:49.657610 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:49.657519 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"54a15403ebda57ce34364123f5d89692a6463665952c88975cbd3b913f168dc3"} Apr 23 08:12:50.530976 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:50.530943 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:50.531173 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:50.531090 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:51.530998 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.530916 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:51.531401 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.530920 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:51.531401 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:51.531021 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:51.531401 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:51.531106 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:51.662879 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.662846 2579 generic.go:358] "Generic (PLEG): container finished" podID="325bc232-cf8c-46f7-a278-679124fa4e09" containerID="a91face72cff86af0bfe97ed582c07c2c22d4b156f8946d8b3125199dd2b7205" exitCode=0 Apr 23 08:12:51.663036 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.662927 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerDied","Data":"a91face72cff86af0bfe97ed582c07c2c22d4b156f8946d8b3125199dd2b7205"} Apr 23 08:12:51.666189 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.666166 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log" Apr 23 08:12:51.666565 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.666542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"7826bcbf0c948be4984c1d0a0363fe2efc94e6c02dafa46f425b4424020a2c42"} Apr 23 08:12:51.666883 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.666864 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:51.666962 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.666888 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:51.667024 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.667010 2579 scope.go:117] "RemoveContainer" containerID="374e7764e037496f01ecfc75460213ae3962df60ca43f90981e86189e022e526" Apr 23 08:12:51.683691 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.683671 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:51.686499 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:51.686455 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zltrc" podStartSLOduration=6.78469902 podStartE2EDuration="26.686439027s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.22988675 +0000 UTC m=+3.356890554" lastFinishedPulling="2026-04-23 08:12:48.131626745 +0000 UTC 
m=+23.258630561" observedRunningTime="2026-04-23 08:12:48.67620334 +0000 UTC m=+23.803207168" watchObservedRunningTime="2026-04-23 08:12:51.686439027 +0000 UTC m=+26.813442854" Apr 23 08:12:52.531823 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.531799 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:52.532238 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:52.531945 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:52.673113 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.673027 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log" Apr 23 08:12:52.673508 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.673474 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" event={"ID":"ec9efb36-5e54-40e4-9ff4-f25ef8172507","Type":"ContainerStarted","Data":"a77abd357c723933e521f3d1e69ab6c9c62f8e9917bd3c3f553e623002437732"} Apr 23 08:12:52.674345 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.674315 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:52.676041 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.676016 2579 generic.go:358] "Generic (PLEG): container finished" podID="325bc232-cf8c-46f7-a278-679124fa4e09" containerID="9380baa2d6b3c1684830db9c8d97427d0d03a91c10cd28b88d0f35a1a6b305c6" exitCode=0 Apr 23 08:12:52.676160 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.676085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerDied","Data":"9380baa2d6b3c1684830db9c8d97427d0d03a91c10cd28b88d0f35a1a6b305c6"} Apr 23 08:12:52.689593 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.689566 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:12:52.705492 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.705449 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" podStartSLOduration=10.283734658 podStartE2EDuration="27.705437096s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.227394875 +0000 UTC m=+3.354398693" lastFinishedPulling="2026-04-23 08:12:45.649097313 +0000 UTC m=+20.776101131" observedRunningTime="2026-04-23 08:12:52.705302147 +0000 UTC m=+27.832305974" watchObservedRunningTime="2026-04-23 08:12:52.705437096 +0000 UTC m=+27.832440923" Apr 23 08:12:52.912517 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.912489 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d5mzc"] Apr 23 08:12:52.912675 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.912603 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:52.912721 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:52.912695 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:52.918283 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.918261 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x8j9k"] Apr 23 08:12:52.918399 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.918344 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:52.918444 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:52.918410 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:52.926003 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.925940 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c4bd2"] Apr 23 08:12:52.926119 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:52.926016 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:52.926119 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:52.926090 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:53.680078 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:53.680032 2579 generic.go:358] "Generic (PLEG): container finished" podID="325bc232-cf8c-46f7-a278-679124fa4e09" containerID="ce4404f32c8c6357179841c17a791c60e7e050817ee976b30706664343fc7cd5" exitCode=0 Apr 23 08:12:53.680627 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:53.680098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerDied","Data":"ce4404f32c8c6357179841c17a791c60e7e050817ee976b30706664343fc7cd5"} Apr 23 08:12:54.531600 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:54.531567 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:54.531773 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:54.531568 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:54.531773 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:54.531680 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:54.531907 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:54.531778 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:54.531907 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:54.531578 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:54.531907 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:54.531876 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:56.531215 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:56.531185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:56.531992 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:56.531185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:56.531992 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:56.531293 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:56.531992 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:56.531347 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:56.531992 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:56.531185 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:56.531992 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:56.531490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:58.531348 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:58.531314 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:12:58.531910 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:58.531350 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:58.531910 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:58.531438 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:58.531910 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:58.531442 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x8j9k" podUID="98eee7d8-bae0-438a-887a-8591437a310c" Apr 23 08:12:58.531910 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:58.531526 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c4bd2" podUID="7aedc835-f4c8-4970-ba3b-30679f2aa6e9" Apr 23 08:12:58.531910 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:58.531618 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d5mzc" podUID="41fa5042-9289-494c-9973-953c5146e01c" Apr 23 08:12:59.142005 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.141971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:12:59.142192 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.142115 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:59.142259 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.142199 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. No retries permitted until 2026-04-23 08:13:31.142182985 +0000 UTC m=+66.269186789 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:59.243489 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.243261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:12:59.243655 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.243455 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:59.243655 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.243538 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:59.243655 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.243554 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7pdll for pod openshift-network-diagnostics/network-check-target-c4bd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:59.243655 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.243611 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll podName:7aedc835-f4c8-4970-ba3b-30679f2aa6e9 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:31.243593237 +0000 UTC m=+66.370597057 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7pdll" (UniqueName: "kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll") pod "network-check-target-c4bd2" (UID: "7aedc835-f4c8-4970-ba3b-30679f2aa6e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:59.674385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.674362 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-255.ec2.internal" event="NodeReady" Apr 23 08:12:59.674660 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.674461 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 08:12:59.708663 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.708599 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84ff9c579b-q6lss"] Apr 23 08:12:59.738154 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.738127 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s2qrh"] Apr 23 08:12:59.738399 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.738373 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.741150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.740998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-24qw2\"" Apr 23 08:12:59.741150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.740998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 08:12:59.741150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.741033 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 08:12:59.741150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.741147 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 08:12:59.749078 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.749029 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 08:12:59.756260 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.756237 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84ff9c579b-q6lss"] Apr 23 08:12:59.756345 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.756270 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8969w"] Apr 23 08:12:59.756387 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.756355 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.758769 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.758749 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 08:12:59.758769 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.758762 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dcms6\"" Apr 23 08:12:59.758886 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.758753 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 08:12:59.777135 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.777109 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8969w"] Apr 23 08:12:59.777212 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.777147 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s2qrh"] Apr 23 08:12:59.777249 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.777210 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:12:59.779642 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.779621 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 08:12:59.780217 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.780199 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 08:12:59.780351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.780333 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 08:12:59.780416 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.780339 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sb9bx\"" Apr 23 08:12:59.849090 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-bound-sa-token\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849213 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849101 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-installation-pull-secrets\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849213 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7580ab91-3a6f-405a-b1b4-402044f4cd59-ca-trust-extracted\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849213 ip-10-0-142-255 kubenswrapper[2579]: I0423 
08:12:59.849143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.849213 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849164 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-image-registry-private-configuration\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849213 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-certificates\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-config-volume\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.849402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-tmp-dir\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.849402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9v7r\" (UniqueName: \"kubernetes.io/projected/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-kube-api-access-v9v7r\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.849402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-trusted-ca\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849352 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.849402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.849372 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5ss\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-kube-api-access-tv5ss\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950529 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950501 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-trusted-ca\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950625 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950625 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5ss\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-kube-api-access-tv5ss\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950625 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:12:59.950775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-bound-sa-token\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-installation-pull-secrets\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7580ab91-3a6f-405a-b1b4-402044f4cd59-ca-trust-extracted\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.950775 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.950722 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" 
not found Apr 23 08:12:59.950775 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.950745 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:12:59.950775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-image-registry-private-configuration\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.950836 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:00.450809802 +0000 UTC m=+35.577813606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-certificates\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qh6t\" (UniqueName: \"kubernetes.io/projected/0721f5e7-8efa-45d5-a74c-776e204f81c6-kube-api-access-9qh6t\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-config-volume\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-tmp-dir\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.951049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.950997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9v7r\" (UniqueName: \"kubernetes.io/projected/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-kube-api-access-v9v7r\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh"
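The burst of secret "..." not found failures here is the expected bootstrap ordering rather than a fault: image-registry-tls, dns-default-metrics-tls, canary-serving-cert, and metrics-daemon-secret are serving-cert secrets that are created asynchronously, typically by the OpenShift service-ca operator from the service.beta.openshift.io/serving-cert-secret-name annotation on the owning Service, so the kubelet reaches MountVolume.SetUp before the objects exist and keeps retrying until they appear. A minimal client-go sketch, assuming a default kubeconfig, for checking which of the secrets named above have materialized; the namespace/name pairs are copied from the log, everything else is illustrative:

    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Build a client from the default kubeconfig; the namespace/name
        // pairs below are the ones failing in the MountVolume.SetUp entries.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)
        for _, s := range []struct{ ns, name string }{
            {"openshift-image-registry", "image-registry-tls"},
            {"openshift-dns", "dns-default-metrics-tls"},
            {"openshift-ingress-canary", "canary-serving-cert"},
            {"openshift-multus", "metrics-daemon-secret"},
        } {
            _, err := client.CoreV1().Secrets(s.ns).Get(context.Background(), s.name, metav1.GetOptions{})
            switch {
            case err == nil:
                fmt.Printf("%s/%s: present\n", s.ns, s.name)
            case apierrors.IsNotFound(err):
                // Until this flips to present, the kubelet keeps retrying the mount.
                fmt.Printf("%s/%s: still missing\n", s.ns, s.name)
            default:
                panic(err)
            }
        }
    }

Once the operator writes each secret, the next retry visible in the log succeeds without any action on the node.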
"operationExecutor.MountVolume started for volume \"kube-api-access-v9v7r\" (UniqueName: \"kubernetes.io/projected/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-kube-api-access-v9v7r\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.951536 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.951510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-tmp-dir\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.951617 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.951575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-certificates\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.951673 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.951630 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:12:59.951728 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.951666 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-config-volume\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.951728 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:12:59.951692 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:00.451670283 +0000 UTC m=+35.578674088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:12:59.951808 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.951725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-trusted-ca\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.951808 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.951757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7580ab91-3a6f-405a-b1b4-402044f4cd59-ca-trust-extracted\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.956031 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.956011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-installation-pull-secrets\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.956031 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.956019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-image-registry-private-configuration\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.962641 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.962590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5ss\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-kube-api-access-tv5ss\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:12:59.963100 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.963080 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9v7r\" (UniqueName: \"kubernetes.io/projected/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-kube-api-access-v9v7r\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:12:59.971816 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:12:59.971800 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-bound-sa-token\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:00.052265 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.052242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qh6t\" (UniqueName: \"kubernetes.io/projected/0721f5e7-8efa-45d5-a74c-776e204f81c6-kube-api-access-9qh6t\") pod 
\"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:00.052346 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.052309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:00.052431 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.052419 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:00.052473 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.052469 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:00.552457291 +0000 UTC m=+35.679461095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:00.064873 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.064856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qh6t\" (UniqueName: \"kubernetes.io/projected/0721f5e7-8efa-45d5-a74c-776e204f81c6-kube-api-access-9qh6t\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:00.454377 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.454341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:13:00.454524 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.454401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:00.454524 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.454485 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:13:00.454524 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.454496 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:13:00.454634 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.454484 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:00.454634 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.454550 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:13:01.45453538 +0000 UTC m=+36.581539184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:13:00.454634 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.454601 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:01.454576704 +0000 UTC m=+36.581580509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:13:00.531076 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.531030 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:13:00.531076 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.531069 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:13:00.531274 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.531030 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:13:00.533723 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.533694 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f9mps\"" Apr 23 08:13:00.533836 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.533771 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:13:00.533836 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.533830 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:13:00.533955 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.533886 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:13:00.533955 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.533904 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:13:00.533955 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.533907 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mdjsc\"" Apr 23 08:13:00.554966 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.554949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:00.555049 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.555034 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not 
found Apr 23 08:13:00.555123 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:00.555098 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:01.555082792 +0000 UTC m=+36.682086609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:00.696858 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.696829 2579 generic.go:358] "Generic (PLEG): container finished" podID="325bc232-cf8c-46f7-a278-679124fa4e09" containerID="626ccf369d30179226bf1a0035aab7bb876987d25bef1fca701d99ab12d1d3fd" exitCode=0 Apr 23 08:13:00.697240 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:00.696884 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerDied","Data":"626ccf369d30179226bf1a0035aab7bb876987d25bef1fca701d99ab12d1d3fd"} Apr 23 08:13:01.461765 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:01.461719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:13:01.461954 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:01.461828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:01.462013 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.461959 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:13:01.462013 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.461966 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:01.462130 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.462052 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:03.462031194 +0000 UTC m=+38.589034997 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:13:01.464400 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.461976 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:13:01.464400 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.462276 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:03.462252569 +0000 UTC m=+38.589256392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:13:01.562547 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:01.562519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:01.562687 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.562638 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:01.562727 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:01.562690 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:03.562673234 +0000 UTC m=+38.689677038 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:01.701543 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:01.701513 2579 generic.go:358] "Generic (PLEG): container finished" podID="325bc232-cf8c-46f7-a278-679124fa4e09" containerID="fc73b07acb814affdc5cbc81598567f6674cf5abb4c35a15732e4851af0f978f" exitCode=0 Apr 23 08:13:01.701870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:01.701561 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerDied","Data":"fc73b07acb814affdc5cbc81598567f6674cf5abb4c35a15732e4851af0f978f"} Apr 23 08:13:02.705785 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:02.705569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h756m" event={"ID":"325bc232-cf8c-46f7-a278-679124fa4e09","Type":"ContainerStarted","Data":"f8469749b8faa9854c26ddfd9254af6b6e8733eb4ce1e1a84108df7897176b95"} Apr 23 08:13:02.730729 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:02.730680 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-h756m" podStartSLOduration=6.324324597 podStartE2EDuration="37.730666998s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:12:28.251023396 +0000 UTC m=+3.378027205" lastFinishedPulling="2026-04-23 08:12:59.657365802 +0000 UTC m=+34.784369606" observedRunningTime="2026-04-23 08:13:02.729313822 +0000 UTC m=+37.856317648" watchObservedRunningTime="2026-04-23 08:13:02.730666998 +0000 UTC m=+37.857670854" Apr 23 08:13:03.478175 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:03.478138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:13:03.478368 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:03.478202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:03.478368 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.478300 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:03.478368 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.478320 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:13:03.478368 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.478331 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:13:03.478561 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.478373 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:07.478354289 +0000 UTC m=+42.605358094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:13:03.478561 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.478392 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:07.478383686 +0000 UTC m=+42.605387489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:13:03.578761 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:03.578722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:03.578901 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.578821 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:03.578901 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:03.578865 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:07.578852473 +0000 UTC m=+42.705856277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:05.088463 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:05.088428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:13:05.091717 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:05.091696 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98eee7d8-bae0-438a-887a-8591437a310c-original-pull-secret\") pod \"global-pull-secret-syncer-x8j9k\" (UID: \"98eee7d8-bae0-438a-887a-8591437a310c\") " pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:13:05.347755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:05.347659 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x8j9k" Apr 23 08:13:05.470847 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:05.470820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x8j9k"] Apr 23 08:13:05.474045 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:13:05.474016 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98eee7d8_bae0_438a_887a_8591437a310c.slice/crio-e4a42f14d3683f5ad37d2ec30bb69374f90d4d2b04b24d2c6bf270e0dadf3d49 WatchSource:0}: Error finding container e4a42f14d3683f5ad37d2ec30bb69374f90d4d2b04b24d2c6bf270e0dadf3d49: Status 404 returned error can't find the container with id e4a42f14d3683f5ad37d2ec30bb69374f90d4d2b04b24d2c6bf270e0dadf3d49 Apr 23 08:13:05.712342 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:05.712306 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x8j9k" event={"ID":"98eee7d8-bae0-438a-887a-8591437a310c","Type":"ContainerStarted","Data":"e4a42f14d3683f5ad37d2ec30bb69374f90d4d2b04b24d2c6bf270e0dadf3d49"} Apr 23 08:13:07.506108 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:07.506052 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:13:07.506461 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:07.506126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:07.506461 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.506206 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:13:07.506461 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.506212 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:07.506461 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.506272 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:15.506258387 +0000 UTC m=+50.633262190 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:13:07.506461 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.506216 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:13:07.506461 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.506341 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:15.506325627 +0000 UTC m=+50.633329432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:13:07.606861 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:07.606829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:07.607026 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.606962 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:07.607091 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:07.607042 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:15.607027873 +0000 UTC m=+50.734031682 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:10.722194 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:10.722159 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x8j9k" event={"ID":"98eee7d8-bae0-438a-887a-8591437a310c","Type":"ContainerStarted","Data":"a12b0af745859ec02e0f56b4283594ef1331c922bfb4f1676372841c01eb348c"} Apr 23 08:13:10.736901 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:10.736857 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x8j9k" podStartSLOduration=32.984336243 podStartE2EDuration="37.736844598s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:13:05.47552125 +0000 UTC m=+40.602525055" lastFinishedPulling="2026-04-23 08:13:10.228029606 +0000 UTC m=+45.355033410" observedRunningTime="2026-04-23 08:13:10.73651888 +0000 UTC m=+45.863522709" watchObservedRunningTime="2026-04-23 08:13:10.736844598 +0000 UTC m=+45.863848420" Apr 23 08:13:15.570355 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:15.570322 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:15.570764 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:15.570383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:13:15.570764 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.570457 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:15.570764 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.570461 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:13:15.570764 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.570477 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:13:15.570764 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.570506 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:31.570491232 +0000 UTC m=+66.697495035 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:13:15.570764 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.570525 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:31.570511889 +0000 UTC m=+66.697515693 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:13:15.671690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:15.671655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:15.671850 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.671796 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:15.671889 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:15.671880 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:31.671864614 +0000 UTC m=+66.798868417 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:24.692867 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:24.692838 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svpkl" Apr 23 08:13:31.184672 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.184637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc" Apr 23 08:13:31.187236 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.187218 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:13:31.195136 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.195120 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:13:31.195190 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.195180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs podName:41fa5042-9289-494c-9973-953c5146e01c nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:35.195159982 +0000 UTC m=+130.322163786 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs") pod "network-metrics-daemon-d5mzc" (UID: "41fa5042-9289-494c-9973-953c5146e01c") : secret "metrics-daemon-secret" not found Apr 23 08:13:31.285672 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.285641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:13:31.288264 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.288248 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:13:31.298333 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.298318 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:13:31.308869 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.308844 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdll\" (UniqueName: \"kubernetes.io/projected/7aedc835-f4c8-4970-ba3b-30679f2aa6e9-kube-api-access-7pdll\") pod \"network-check-target-c4bd2\" (UID: \"7aedc835-f4c8-4970-ba3b-30679f2aa6e9\") " pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:13:31.453993 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.453917 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mdjsc\"" Apr 23 08:13:31.461536 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.461522 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2"
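The durationBeforeRetry values across these entries trace the kubelet's per-volume exponential backoff: registry-tls, metrics-tls, and cert have stepped through 500ms, 1s, 2s, 4s, 8s, and 16s so far, while metrics-certs, whose failures began earlier in the log, has already reached 1m4s. Each consecutive failure doubles the wait up to a cap; the 500ms start and 2m2s ceiling below are assumptions matching the exponentialbackoff helper in recent kubelet sources, not values this log states directly. A minimal sketch of that schedule:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Doubling schedule behind the durationBeforeRetry values above;
        // initial delay and cap are assumed from recent kubelet sources.
        const maxDelay = 2*time.Minute + 2*time.Second
        delay := 500 * time.Millisecond
        for i := 0; i < 9; i++ {
            fmt.Println(delay) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s
            if delay *= 2; delay > maxDelay {
                delay = maxDelay
            }
        }
    }

The counter is kept per volume operation, which is why metrics-certs sits several steps ahead of the volumes that only began failing in this window.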
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:13:31.573400 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.573369 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c4bd2"] Apr 23 08:13:31.577044 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:13:31.577011 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aedc835_f4c8_4970_ba3b_30679f2aa6e9.slice/crio-d55e526a123b4e8f845dbc349293d38b17ca0f7e7c702a5a30f9c272ba66aaf1 WatchSource:0}: Error finding container d55e526a123b4e8f845dbc349293d38b17ca0f7e7c702a5a30f9c272ba66aaf1: Status 404 returned error can't find the container with id d55e526a123b4e8f845dbc349293d38b17ca0f7e7c702a5a30f9c272ba66aaf1 Apr 23 08:13:31.587276 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.587252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:13:31.587356 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.587314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:13:31.587413 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.587407 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:31.587458 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.587406 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:13:31.587458 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.587425 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:13:31.587531 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.587458 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.587446006 +0000 UTC m=+98.714449810 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:13:31.587531 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.587490 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.587471183 +0000 UTC m=+98.714474990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:13:31.687815 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.687780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:13:31.687963 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.687892 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:31.687963 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:31.687945 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.687930242 +0000 UTC m=+98.814934049 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:13:31.759867 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:31.759781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c4bd2" event={"ID":"7aedc835-f4c8-4970-ba3b-30679f2aa6e9","Type":"ContainerStarted","Data":"d55e526a123b4e8f845dbc349293d38b17ca0f7e7c702a5a30f9c272ba66aaf1"} Apr 23 08:13:34.767562 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:34.767520 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c4bd2" event={"ID":"7aedc835-f4c8-4970-ba3b-30679f2aa6e9","Type":"ContainerStarted","Data":"1c978a46f3b61504ac5d7f3d412cc1fb45a53d840441a5bc575e9a78a8476632"} Apr 23 08:13:34.767942 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:34.767648 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:13:34.783792 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:34.783746 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-c4bd2" podStartSLOduration=67.121657196 podStartE2EDuration="1m9.783732247s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:13:31.578825858 +0000 UTC m=+66.705829662" lastFinishedPulling="2026-04-23 08:13:34.240900906 +0000 UTC m=+69.367904713" observedRunningTime="2026-04-23 08:13:34.782523518 +0000 UTC m=+69.909527344" watchObservedRunningTime="2026-04-23 08:13:34.783732247 +0000 UTC m=+69.910736073" Apr 23 08:13:59.410912 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.410879 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"] Apr 23 08:13:59.415232 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.415213 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-64566ff68-kfxkd"] Apr 23 08:13:59.415370 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.415352 
2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.419197 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.419162 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 08:13:59.419197 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.419189 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:13:59.419368 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.419251 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8wvhb\"" Apr 23 08:13:59.421333 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.421315 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 08:13:59.421439 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.421342 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:13:59.421499 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.421483 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.423717 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.423700 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"] Apr 23 08:13:59.423811 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.423796 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 08:13:59.423979 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.423958 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 08:13:59.424690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.424216 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 08:13:59.424690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.424427 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 08:13:59.424690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.424441 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 08:13:59.424690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.424544 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6x7wn\"" Apr 23 08:13:59.424690 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.424670 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 08:13:59.440166 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.440144 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-64566ff68-kfxkd"] Apr 23 08:13:59.479576 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479555 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.479672 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.479672 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479657 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-default-certificate\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.479746 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-stats-auth\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.479746 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wz6\" (UniqueName: \"kubernetes.io/projected/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-kube-api-access-q4wz6\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.479746 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.479838 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479747 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.479838 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.479762 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jfmk\" (UniqueName: \"kubernetes.io/projected/8d015a9d-342c-4585-8306-d78b5774129d-kube-api-access-4jfmk\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.517331 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.517304 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"] Apr 23 08:13:59.520035 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.520021 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-n42s5"] Apr 23 08:13:59.520177 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.520162 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:13:59.522531 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.522512 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 08:13:59.522599 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.522549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:13:59.522599 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.522563 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hpw7m\"" Apr 23 08:13:59.522881 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.522865 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 08:13:59.522920 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.522910 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.528354 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.528338 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-m4twr\"" Apr 23 08:13:59.529135 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.529118 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 08:13:59.529593 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.529575 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 08:13:59.530413 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.530376 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:13:59.530559 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.530467 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:13:59.536646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.536628 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 08:13:59.538011 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.537991 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-n42s5"] Apr 23 08:13:59.538755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.538731 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"] Apr 23 08:13:59.580253 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-h6fpv\" (UniqueName: \"kubernetes.io/projected/7318baba-3bf1-4121-9683-c65da37bc5ee-kube-api-access-h6fpv\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:13:59.580367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.580367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.580367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580307 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:13:59.580367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.580367 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wz6\" (UniqueName: \"kubernetes.io/projected/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-kube-api-access-q4wz6\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.580386 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z595\" (UniqueName: 
\"kubernetes.io/projected/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-kube-api-access-6z595\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jfmk\" (UniqueName: \"kubernetes.io/projected/8d015a9d-342c-4585-8306-d78b5774129d-kube-api-access-4jfmk\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.580442 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls podName:1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:00.080422533 +0000 UTC m=+95.207426338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xvxq4" (UID: "1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-serving-cert\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.580498 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:13:59.580611 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.580582 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:00.080564505 +0000 UTC m=+95.207568323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : secret "router-metrics-certs-default" not found Apr 23 08:13:59.580953 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-tmp\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.580953 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580664 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-default-certificate\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.580953 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580702 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-stats-auth\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.580953 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-snapshots\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.580953 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-service-ca-bundle\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.580953 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.580911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.581198 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.581041 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:00.081029327 +0000 UTC m=+95.208033131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : configmap references non-existent config key: service-ca.crt Apr 23 08:13:59.581483 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.581468 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.583019 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.582998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-stats-auth\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.585083 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.583360 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-default-certificate\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.589868 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.589846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jfmk\" (UniqueName: \"kubernetes.io/projected/8d015a9d-342c-4585-8306-d78b5774129d-kube-api-access-4jfmk\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:13:59.589946 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.589898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wz6\" (UniqueName: \"kubernetes.io/projected/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-kube-api-access-q4wz6\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:13:59.681946 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.681874 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-snapshots\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.681946 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.681920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-service-ca-bundle\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fpv\" (UniqueName: 
\"kubernetes.io/projected/7318baba-3bf1-4121-9683-c65da37bc5ee-kube-api-access-h6fpv\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:13:59.682150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682113 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:13:59.682150 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682274 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682176 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z595\" (UniqueName: \"kubernetes.io/projected/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-kube-api-access-6z595\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682274 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.682251 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:13:59.682373 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:13:59.682310 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls podName:7318baba-3bf1-4121-9683-c65da37bc5ee nodeName:}" failed. No retries permitted until 2026-04-23 08:14:00.182294055 +0000 UTC m=+95.309297860 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7xpw" (UID: "7318baba-3bf1-4121-9683-c65da37bc5ee") : secret "samples-operator-tls" not found Apr 23 08:13:59.682373 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-serving-cert\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682472 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-tmp\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682472 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-service-ca-bundle\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682624 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-snapshots\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682670 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-tmp\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.682930 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.682912 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.684421 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.684406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-serving-cert\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.690434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.690411 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fpv\" (UniqueName: \"kubernetes.io/projected/7318baba-3bf1-4121-9683-c65da37bc5ee-kube-api-access-h6fpv\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: 
\"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:13:59.690947 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.690931 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z595\" (UniqueName: \"kubernetes.io/projected/1a2fda67-d89f-4fa0-a61a-91dfc33b57fd-kube-api-access-6z595\") pod \"insights-operator-585dfdc468-n42s5\" (UID: \"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd\") " pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.834738 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.834706 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-n42s5" Apr 23 08:13:59.946387 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:13:59.946310 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-n42s5"] Apr 23 08:13:59.950627 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:13:59.950601 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a2fda67_d89f_4fa0_a61a_91dfc33b57fd.slice/crio-b9b943f093a96398035bd69af306b76d5c81c2d1500cc688ab60fff02777a22d WatchSource:0}: Error finding container b9b943f093a96398035bd69af306b76d5c81c2d1500cc688ab60fff02777a22d: Status 404 returned error can't find the container with id b9b943f093a96398035bd69af306b76d5c81c2d1500cc688ab60fff02777a22d Apr 23 08:14:00.085829 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:00.085801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:14:00.085977 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:00.085848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:00.085977 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.085939 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:14:00.086128 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.085991 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:14:00.086128 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.085998 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls podName:1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:01.085980774 +0000 UTC m=+96.212984578 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xvxq4" (UID: "1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:14:00.086128 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.086030 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:01.086018864 +0000 UTC m=+96.213022668 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : secret "router-metrics-certs-default" not found Apr 23 08:14:00.086256 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:00.086133 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:00.086256 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.086222 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:01.086215298 +0000 UTC m=+96.213219102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : configmap references non-existent config key: service-ca.crt Apr 23 08:14:00.186909 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:00.186880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:14:00.187011 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.186994 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:14:00.187091 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:00.187045 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls podName:7318baba-3bf1-4121-9683-c65da37bc5ee nodeName:}" failed. No retries permitted until 2026-04-23 08:14:01.187032246 +0000 UTC m=+96.314036049 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7xpw" (UID: "7318baba-3bf1-4121-9683-c65da37bc5ee") : secret "samples-operator-tls" not found Apr 23 08:14:00.818732 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:00.818690 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n42s5" event={"ID":"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd","Type":"ContainerStarted","Data":"b9b943f093a96398035bd69af306b76d5c81c2d1500cc688ab60fff02777a22d"} Apr 23 08:14:01.094625 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:01.094543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:01.094625 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:01.094614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:14:01.094839 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:01.094666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:01.094839 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.094731 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.094710141 +0000 UTC m=+98.221713945 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : configmap references non-existent config key: service-ca.crt Apr 23 08:14:01.094839 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.094782 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:14:01.094839 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.094791 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:14:01.095024 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.094852 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.094837104 +0000 UTC m=+98.221840911 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : secret "router-metrics-certs-default" not found Apr 23 08:14:01.095024 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.094870 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls podName:1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.094860788 +0000 UTC m=+98.221864593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xvxq4" (UID: "1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:14:01.195541 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:01.195506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:14:01.195707 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.195665 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:14:01.195754 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:01.195742 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls podName:7318baba-3bf1-4121-9683-c65da37bc5ee nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.195721826 +0000 UTC m=+98.322725653 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7xpw" (UID: "7318baba-3bf1-4121-9683-c65da37bc5ee") : secret "samples-operator-tls" not found Apr 23 08:14:02.824285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:02.824247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n42s5" event={"ID":"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd","Type":"ContainerStarted","Data":"cd417f62c894e9b362d222223a9e16abab67ee58b0907d3f0d024d97f3af6bc0"} Apr 23 08:14:02.841945 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:02.841890 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-n42s5" podStartSLOduration=1.460904325 podStartE2EDuration="3.841871871s" podCreationTimestamp="2026-04-23 08:13:59 +0000 UTC" firstStartedPulling="2026-04-23 08:13:59.952278494 +0000 UTC m=+95.079282297" lastFinishedPulling="2026-04-23 08:14:02.333246039 +0000 UTC m=+97.460249843" observedRunningTime="2026-04-23 08:14:02.8398331 +0000 UTC m=+97.966836926" watchObservedRunningTime="2026-04-23 08:14:02.841871871 +0000 UTC m=+97.968875699" Apr 23 08:14:03.111759 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.111666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:03.111759 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.111722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:14:03.111759 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.111752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:03.112021 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.111826 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:14:03.112021 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.111831 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:07.111810448 +0000 UTC m=+102.238814254 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : configmap references non-existent config key: service-ca.crt Apr 23 08:14:03.112021 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.111873 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:07.111860157 +0000 UTC m=+102.238863961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : secret "router-metrics-certs-default" not found Apr 23 08:14:03.112021 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.111878 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:14:03.112021 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.111910 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls podName:1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:07.111902484 +0000 UTC m=+102.238906288 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xvxq4" (UID: "1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:14:03.212672 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.212633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" Apr 23 08:14:03.213020 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.212941 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:14:03.213242 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.213230 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls podName:7318baba-3bf1-4121-9683-c65da37bc5ee nodeName:}" failed. No retries permitted until 2026-04-23 08:14:07.21320311 +0000 UTC m=+102.340206917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7xpw" (UID: "7318baba-3bf1-4121-9683-c65da37bc5ee") : secret "samples-operator-tls" not found Apr 23 08:14:03.615636 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.615604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") pod \"image-registry-84ff9c579b-q6lss\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:14:03.615774 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.615671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh" Apr 23 08:14:03.615774 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.615756 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:03.615774 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.615763 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:03.615885 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.615780 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84ff9c579b-q6lss: secret "image-registry-tls" not found Apr 23 08:14:03.615885 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.615804 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls podName:b1a03cf0-a55d-4e77-9ca0-a33f942b3b24 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:07.615791861 +0000 UTC m=+162.742795665 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls") pod "dns-default-s2qrh" (UID: "b1a03cf0-a55d-4e77-9ca0-a33f942b3b24") : secret "dns-default-metrics-tls" not found Apr 23 08:14:03.615885 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.615835 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls podName:7580ab91-3a6f-405a-b1b4-402044f4cd59 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:07.615821028 +0000 UTC m=+162.742824832 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls") pod "image-registry-84ff9c579b-q6lss" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59") : secret "image-registry-tls" not found Apr 23 08:14:03.716852 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:03.716812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:14:03.716992 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.716948 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:03.717049 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:03.717003 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert podName:0721f5e7-8efa-45d5-a74c-776e204f81c6 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:07.716989044 +0000 UTC m=+162.843992851 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert") pod "ingress-canary-8969w" (UID: "0721f5e7-8efa-45d5-a74c-776e204f81c6") : secret "canary-serving-cert" not found Apr 23 08:14:05.209474 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:05.209444 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hnjm9_cf4bc19e-bb95-4e2e-9978-4a53f064696d/dns-node-resolver/0.log" Apr 23 08:14:05.772265 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:05.772236 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-c4bd2" Apr 23 08:14:06.008850 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:06.008822 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9b2bf_10f6f586-998e-4725-bb51-e801aba526fe/node-ca/0.log" Apr 23 08:14:07.142744 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.142712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:07.143174 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.142774 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" Apr 23 08:14:07.143174 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.142836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd" Apr 23 08:14:07.143174 ip-10-0-142-255 
kubenswrapper[2579]: E0423 08:14:07.142888 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.142869313 +0000 UTC m=+110.269873120 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : configmap references non-existent config key: service-ca.crt
Apr 23 08:14:07.143174 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:07.142928 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:14:07.143174 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:07.142931 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 08:14:07.143174 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:07.142967 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.142956598 +0000 UTC m=+110.269960415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : secret "router-metrics-certs-default" not found
Apr 23 08:14:07.143489 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:07.143349 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls podName:1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.143312113 +0000 UTC m=+110.270315918 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xvxq4" (UID: "1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 08:14:07.243259 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.243231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"
Apr 23 08:14:07.243402 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:07.243333 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 08:14:07.243402 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:07.243382 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls podName:7318baba-3bf1-4121-9683-c65da37bc5ee nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.243369813 +0000 UTC m=+110.370373618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7xpw" (UID: "7318baba-3bf1-4121-9683-c65da37bc5ee") : secret "samples-operator-tls" not found
Apr 23 08:14:07.503716 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.503645 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"]
Apr 23 08:14:07.506438 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.506423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"
Apr 23 08:14:07.508836 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.508816 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 08:14:07.508836 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.508824 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wm486\""
Apr 23 08:14:07.509806 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.509791 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 08:14:07.520527 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.520505 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"]
Apr 23 08:14:07.646335 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.646307 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwkjr\" (UniqueName: \"kubernetes.io/projected/88352a02-c1be-4e1c-9b7a-9e7009f4911c-kube-api-access-nwkjr\") pod \"migrator-74bb7799d9-xxn8x\" (UID: \"88352a02-c1be-4e1c-9b7a-9e7009f4911c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"
Apr 23 08:14:07.747277 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.747246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwkjr\" (UniqueName: \"kubernetes.io/projected/88352a02-c1be-4e1c-9b7a-9e7009f4911c-kube-api-access-nwkjr\") pod \"migrator-74bb7799d9-xxn8x\" (UID: \"88352a02-c1be-4e1c-9b7a-9e7009f4911c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"
Apr 23 08:14:07.755375 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.755324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwkjr\" (UniqueName: \"kubernetes.io/projected/88352a02-c1be-4e1c-9b7a-9e7009f4911c-kube-api-access-nwkjr\") pod \"migrator-74bb7799d9-xxn8x\" (UID: \"88352a02-c1be-4e1c-9b7a-9e7009f4911c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"
Apr 23 08:14:07.815468 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.815442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"
Apr 23 08:14:07.928951 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:07.928922 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x"]
Apr 23 08:14:07.932794 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:07.932763 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88352a02_c1be_4e1c_9b7a_9e7009f4911c.slice/crio-5bfe8c6124d64baba4c043dbba2138833433bd0731420d6b624eec780a05d361 WatchSource:0}: Error finding container 5bfe8c6124d64baba4c043dbba2138833433bd0731420d6b624eec780a05d361: Status 404 returned error can't find the container with id 5bfe8c6124d64baba4c043dbba2138833433bd0731420d6b624eec780a05d361
Apr 23 08:14:08.837727 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:08.837695 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x" event={"ID":"88352a02-c1be-4e1c-9b7a-9e7009f4911c","Type":"ContainerStarted","Data":"5bfe8c6124d64baba4c043dbba2138833433bd0731420d6b624eec780a05d361"}
Apr 23 08:14:09.842279 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:09.842170 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x" event={"ID":"88352a02-c1be-4e1c-9b7a-9e7009f4911c","Type":"ContainerStarted","Data":"dc9c60ce6a92735f9fa63b97f359cc4898e1573c4d8c1e7e7e46cd929c5309f5"}
Apr 23 08:14:09.842279 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:09.842229 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x" event={"ID":"88352a02-c1be-4e1c-9b7a-9e7009f4911c","Type":"ContainerStarted","Data":"9423852c164b2464ecd072da4e568252c9f73329738f04b9a1f591b3414ece42"}
Apr 23 08:14:09.857910 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:09.857861 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xxn8x" podStartSLOduration=1.396905515 podStartE2EDuration="2.857847873s" podCreationTimestamp="2026-04-23 08:14:07 +0000 UTC" firstStartedPulling="2026-04-23 08:14:07.934469702 +0000 UTC m=+103.061473518" lastFinishedPulling="2026-04-23 08:14:09.395412071 +0000 UTC m=+104.522415876" observedRunningTime="2026-04-23 08:14:09.85769188 +0000 UTC m=+104.984695707" watchObservedRunningTime="2026-04-23 08:14:09.857847873 +0000 UTC m=+104.984851699"
Apr 23 08:14:15.204201 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.204166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.204218 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:15.204309 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:15.204323 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:31.204304539 +0000 UTC m=+126.331308346 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : configmap references non-existent config key: service-ca.crt
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:15.204352 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls podName:1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a nodeName:}" failed. No retries permitted until 2026-04-23 08:14:31.204340786 +0000 UTC m=+126.331344590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xvxq4" (UID: "1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a") : secret "cluster-monitoring-operator-tls" not found
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.204374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:15.204445 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:14:15.204570 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:15.204475 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs podName:8d015a9d-342c-4585-8306-d78b5774129d nodeName:}" failed. No retries permitted until 2026-04-23 08:14:31.204468054 +0000 UTC m=+126.331471859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs") pod "router-default-64566ff68-kfxkd" (UID: "8d015a9d-342c-4585-8306-d78b5774129d") : secret "router-metrics-certs-default" not found
Apr 23 08:14:15.304958 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.304921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"
Apr 23 08:14:15.307327 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.307305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7318baba-3bf1-4121-9683-c65da37bc5ee-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7xpw\" (UID: \"7318baba-3bf1-4121-9683-c65da37bc5ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"
Apr 23 08:14:15.429751 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.429726 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"
Apr 23 08:14:15.544667 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.544638 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw"]
Apr 23 08:14:15.856492 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:15.856402 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" event={"ID":"7318baba-3bf1-4121-9683-c65da37bc5ee","Type":"ContainerStarted","Data":"365fb37b850e25c2c68576c4d738429d485dd17d8908f9e34043e88bda9ee4b4"}
Apr 23 08:14:17.863644 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:17.863558 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" event={"ID":"7318baba-3bf1-4121-9683-c65da37bc5ee","Type":"ContainerStarted","Data":"49a4e415c4b57f7992fe3c9783099e176c6cdcb3a0d2e7c5b6798430d68c5263"}
Apr 23 08:14:17.863644 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:17.863594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" event={"ID":"7318baba-3bf1-4121-9683-c65da37bc5ee","Type":"ContainerStarted","Data":"b9ae6cfd2ef8984a5b8e11ffe3f1f84432568f0525338d90577e8489d330eeb6"}
Apr 23 08:14:17.881506 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:17.881464 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7xpw" podStartSLOduration=17.069915299 podStartE2EDuration="18.881448766s" podCreationTimestamp="2026-04-23 08:13:59 +0000 UTC" firstStartedPulling="2026-04-23 08:14:15.583598769 +0000 UTC m=+110.710602574" lastFinishedPulling="2026-04-23 08:14:17.395132235 +0000 UTC m=+112.522136041" observedRunningTime="2026-04-23 08:14:17.881089141 +0000 UTC m=+113.008092967" watchObservedRunningTime="2026-04-23 08:14:17.881448766 +0000 UTC m=+113.008452592"
Apr 23 08:14:31.247341 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.247303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:31.247879 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.247364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"
Apr 23 08:14:31.247879 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.247399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:31.247977 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.247947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d015a9d-342c-4585-8306-d78b5774129d-service-ca-bundle\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:31.249733 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.249710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d015a9d-342c-4585-8306-d78b5774129d-metrics-certs\") pod \"router-default-64566ff68-kfxkd\" (UID: \"8d015a9d-342c-4585-8306-d78b5774129d\") " pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:31.249906 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.249885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xvxq4\" (UID: \"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"
Apr 23 08:14:31.531907 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.531819 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8wvhb\""
Apr 23 08:14:31.536203 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.536185 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6x7wn\""
Apr 23 08:14:31.539395 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.539376 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"
Apr 23 08:14:31.544960 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.544942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:31.660362 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.660337 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4"]
Apr 23 08:14:31.663450 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:31.663427 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce3ff11_bba5_4c93_b9a5_9da3915e6c2a.slice/crio-853cd5aa4217d00771d738b38fee2e14de1c89a71ed429f9b564eeba72b02a8b WatchSource:0}: Error finding container 853cd5aa4217d00771d738b38fee2e14de1c89a71ed429f9b564eeba72b02a8b: Status 404 returned error can't find the container with id 853cd5aa4217d00771d738b38fee2e14de1c89a71ed429f9b564eeba72b02a8b
Apr 23 08:14:31.677072 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.677017 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-64566ff68-kfxkd"]
Apr 23 08:14:31.680919 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:31.680895 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d015a9d_342c_4585_8306_d78b5774129d.slice/crio-436e20ac1a1424a04ff1ee757079c31f97007db4c6bfd23a2433a34448618cc8 WatchSource:0}: Error finding container 436e20ac1a1424a04ff1ee757079c31f97007db4c6bfd23a2433a34448618cc8: Status 404 returned error can't find the container with id 436e20ac1a1424a04ff1ee757079c31f97007db4c6bfd23a2433a34448618cc8
Apr 23 08:14:31.897194 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.897162 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" event={"ID":"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a","Type":"ContainerStarted","Data":"853cd5aa4217d00771d738b38fee2e14de1c89a71ed429f9b564eeba72b02a8b"}
Apr 23 08:14:31.898304 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.898282 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-64566ff68-kfxkd" event={"ID":"8d015a9d-342c-4585-8306-d78b5774129d","Type":"ContainerStarted","Data":"05c9465d2cd4177bd72620a05f635861c5c3a38a176ccc1a3d66b9f42c75672b"}
Apr 23 08:14:31.898398 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.898308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-64566ff68-kfxkd" event={"ID":"8d015a9d-342c-4585-8306-d78b5774129d","Type":"ContainerStarted","Data":"436e20ac1a1424a04ff1ee757079c31f97007db4c6bfd23a2433a34448618cc8"}
Apr 23 08:14:31.930092 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.927853 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-64566ff68-kfxkd" podStartSLOduration=32.927831525 podStartE2EDuration="32.927831525s" podCreationTimestamp="2026-04-23 08:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:31.925715519 +0000 UTC m=+127.052719345" watchObservedRunningTime="2026-04-23 08:14:31.927831525 +0000 UTC m=+127.054835352"
Apr 23 08:14:31.952811 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.952782 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5kw49"]
Apr 23 08:14:31.955936 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.955917 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:31.960193 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.960154 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 08:14:31.960193 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.960179 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vtw6x\""
Apr 23 08:14:31.960392 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.960259 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 08:14:31.972839 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:31.972820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5kw49"]
Apr 23 08:14:32.054228 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.054200 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbgmc\" (UniqueName: \"kubernetes.io/projected/8800717e-dec3-48fd-8358-621330743c4c-kube-api-access-lbgmc\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.054365 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.054241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8800717e-dec3-48fd-8358-621330743c4c-crio-socket\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.054365 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.054267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8800717e-dec3-48fd-8358-621330743c4c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.054365 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.054300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8800717e-dec3-48fd-8358-621330743c4c-data-volume\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.054365 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.054338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8800717e-dec3-48fd-8358-621330743c4c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155243 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbgmc\" (UniqueName: \"kubernetes.io/projected/8800717e-dec3-48fd-8358-621330743c4c-kube-api-access-lbgmc\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155243 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8800717e-dec3-48fd-8358-621330743c4c-crio-socket\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155412 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8800717e-dec3-48fd-8358-621330743c4c-crio-socket\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155412 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155301 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8800717e-dec3-48fd-8358-621330743c4c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155412 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8800717e-dec3-48fd-8358-621330743c4c-data-volume\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155526 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8800717e-dec3-48fd-8358-621330743c4c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155716 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8800717e-dec3-48fd-8358-621330743c4c-data-volume\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.155862 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.155846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8800717e-dec3-48fd-8358-621330743c4c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.157806 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.157785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8800717e-dec3-48fd-8358-621330743c4c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.163049 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.163030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbgmc\" (UniqueName: \"kubernetes.io/projected/8800717e-dec3-48fd-8358-621330743c4c-kube-api-access-lbgmc\") pod \"insights-runtime-extractor-5kw49\" (UID: \"8800717e-dec3-48fd-8358-621330743c4c\") " pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.264175 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.264149 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5kw49"
Apr 23 08:14:32.376443 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.376412 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5kw49"]
Apr 23 08:14:32.379364 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:32.379336 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8800717e_dec3_48fd_8358_621330743c4c.slice/crio-0914bff0bdbe7b7eef72c170c34352dd6034340c5d5df596bad6150ec7a70aed WatchSource:0}: Error finding container 0914bff0bdbe7b7eef72c170c34352dd6034340c5d5df596bad6150ec7a70aed: Status 404 returned error can't find the container with id 0914bff0bdbe7b7eef72c170c34352dd6034340c5d5df596bad6150ec7a70aed
Apr 23 08:14:32.545423 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.545390 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:32.548299 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.548277 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:32.902862 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.902825 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5kw49" event={"ID":"8800717e-dec3-48fd-8358-621330743c4c","Type":"ContainerStarted","Data":"46b25a7dad3397fad0d8674672be160e999d71d99e850e6bbb23338123b7fc4a"}
Apr 23 08:14:32.902862 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.902864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5kw49" event={"ID":"8800717e-dec3-48fd-8358-621330743c4c","Type":"ContainerStarted","Data":"0914bff0bdbe7b7eef72c170c34352dd6034340c5d5df596bad6150ec7a70aed"}
Apr 23 08:14:32.903411 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.903392 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:32.904706 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:32.904682 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-64566ff68-kfxkd"
Apr 23 08:14:33.906958 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:33.906919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5kw49" event={"ID":"8800717e-dec3-48fd-8358-621330743c4c","Type":"ContainerStarted","Data":"1521ff341ef2707094c7e6d045788843bc34021696a64a6d4cdc629a8124a048"}
Apr 23 08:14:33.908328 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:33.908287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" event={"ID":"1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a","Type":"ContainerStarted","Data":"457e16630e57f77945845b033207b0cfaa4228c8b3b0b0693b287fe905edde82"}
Apr 23 08:14:33.926411 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:33.926341 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xvxq4" podStartSLOduration=32.851709062 podStartE2EDuration="34.926326806s" podCreationTimestamp="2026-04-23 08:13:59 +0000 UTC" firstStartedPulling="2026-04-23 08:14:31.665878194 +0000 UTC m=+126.792881999" lastFinishedPulling="2026-04-23 08:14:33.740495925 +0000 UTC m=+128.867499743" observedRunningTime="2026-04-23 08:14:33.924308199 +0000 UTC m=+129.051312027" watchObservedRunningTime="2026-04-23 08:14:33.926326806 +0000 UTC m=+129.053330633"
Apr 23 08:14:34.912759 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:34.912719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5kw49" event={"ID":"8800717e-dec3-48fd-8358-621330743c4c","Type":"ContainerStarted","Data":"04edb23895e0c14b247fa9e286bea2f3d4b4eb32ea2abf7af645661b2da83f1a"}
Apr 23 08:14:34.931318 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:34.931256 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5kw49" podStartSLOduration=1.686511768 podStartE2EDuration="3.931239287s" podCreationTimestamp="2026-04-23 08:14:31 +0000 UTC" firstStartedPulling="2026-04-23 08:14:32.450249874 +0000 UTC m=+127.577253682" lastFinishedPulling="2026-04-23 08:14:34.694977391 +0000 UTC m=+129.821981201" observedRunningTime="2026-04-23 08:14:34.931094882 +0000 UTC m=+130.058098708" watchObservedRunningTime="2026-04-23 08:14:34.931239287 +0000 UTC m=+130.058243113"
Apr 23 08:14:35.278764 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:35.278730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc"
Apr 23 08:14:35.281127 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:35.281097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41fa5042-9289-494c-9973-953c5146e01c-metrics-certs\") pod \"network-metrics-daemon-d5mzc\" (UID: \"41fa5042-9289-494c-9973-953c5146e01c\") " pod="openshift-multus/network-metrics-daemon-d5mzc"
Apr 23 08:14:35.343807 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:35.343784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f9mps\""
Apr 23 08:14:35.352011 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:35.351995 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5mzc"
Apr 23 08:14:35.501010 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:35.500982 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d5mzc"]
Apr 23 08:14:35.504035 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:35.504009 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41fa5042_9289_494c_9973_953c5146e01c.slice/crio-21580b5ac3df4aea8054dd43cd2169ff2490d3e3bb778e43594860423219aaf7 WatchSource:0}: Error finding container 21580b5ac3df4aea8054dd43cd2169ff2490d3e3bb778e43594860423219aaf7: Status 404 returned error can't find the container with id 21580b5ac3df4aea8054dd43cd2169ff2490d3e3bb778e43594860423219aaf7
Apr 23 08:14:35.916262 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:35.916217 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d5mzc" event={"ID":"41fa5042-9289-494c-9973-953c5146e01c","Type":"ContainerStarted","Data":"21580b5ac3df4aea8054dd43cd2169ff2490d3e3bb778e43594860423219aaf7"}
Apr 23 08:14:36.920488 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:36.920445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d5mzc" event={"ID":"41fa5042-9289-494c-9973-953c5146e01c","Type":"ContainerStarted","Data":"80375fba047f099c66028678c2e35a621cc0b1315196c708c9f325b864fca342"}
Apr 23 08:14:36.920488 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:36.920486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d5mzc" event={"ID":"41fa5042-9289-494c-9973-953c5146e01c","Type":"ContainerStarted","Data":"c684db955863be2bbe7b1861c6ae3fe9fab56c7c7313c69f218e2b1105f267c5"}
Apr 23 08:14:36.936156 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:36.936109 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d5mzc" podStartSLOduration=130.964987946 podStartE2EDuration="2m11.936095176s" podCreationTimestamp="2026-04-23 08:12:25 +0000 UTC" firstStartedPulling="2026-04-23 08:14:35.505707482 +0000 UTC m=+130.632711290" lastFinishedPulling="2026-04-23 08:14:36.476814715 +0000 UTC m=+131.603818520" observedRunningTime="2026-04-23 08:14:36.935762339 +0000 UTC m=+132.062766165" watchObservedRunningTime="2026-04-23 08:14:36.936095176 +0000 UTC m=+132.063099002"
Apr 23 08:14:41.640135 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.640105 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"]
Apr 23 08:14:41.645541 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.645519 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.651217 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.651186 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 23 08:14:41.651217 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.651186 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 23 08:14:41.651413 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.651191 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 08:14:41.651413 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.651192 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-cm9vf\""
Apr 23 08:14:41.660611 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.660587 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"]
Apr 23 08:14:41.664194 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.664177 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-q6vg7"]
Apr 23 08:14:41.664378 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.664359 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.666668 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.666650 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 08:14:41.666762 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.666723 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"]
Apr 23 08:14:41.666831 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.666817 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.666983 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.666962 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 08:14:41.667599 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.667580 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 08:14:41.668197 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.668176 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vmnkz\""
Apr 23 08:14:41.669328 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.669311 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 08:14:41.669328 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.669322 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 08:14:41.669701 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.669675 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 08:14:41.669855 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.669836 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x74z6\""
Apr 23 08:14:41.677782 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.677762 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"]
Apr 23 08:14:41.725312 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.725450 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-textfile\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.725450 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.725450 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725401 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.725633 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725448 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp7vk\" (UniqueName: \"kubernetes.io/projected/11528c38-6db1-493e-b785-5d76aed4e2e9-kube-api-access-cp7vk\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.725633 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725496 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf1f6b7-9b17-42b9-808b-1e95514cc372-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.725633 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11528c38-6db1-493e-b785-5d76aed4e2e9-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.725633 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwl7h\" (UniqueName: \"kubernetes.io/projected/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-api-access-gwl7h\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.725633 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjmhg\" (UniqueName: \"kubernetes.io/projected/e93d401d-bbd4-4bac-856d-242ed681ac6e-kube-api-access-vjmhg\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.725870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-tls\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.725870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.725870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-wtmp\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.725870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725774 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93d401d-bbd4-4bac-856d-242ed681ac6e-metrics-client-ca\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.725870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11528c38-6db1-493e-b785-5d76aed4e2e9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.725870 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11528c38-6db1-493e-b785-5d76aed4e2e9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.726134 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7cf1f6b7-9b17-42b9-808b-1e95514cc372-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.726134 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-root\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.726134 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-sys\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.726134 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.725950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.826703 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.826703 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-wtmp\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93d401d-bbd4-4bac-856d-242ed681ac6e-metrics-client-ca\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826736 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11528c38-6db1-493e-b785-5d76aed4e2e9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11528c38-6db1-493e-b785-5d76aed4e2e9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7cf1f6b7-9b17-42b9-808b-1e95514cc372-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-root\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826886 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-wtmp\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.826898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-sys\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826946 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-sys\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e93d401d-bbd4-4bac-856d-242ed681ac6e-root\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.826995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-textfile\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp7vk\" (UniqueName: \"kubernetes.io/projected/11528c38-6db1-493e-b785-5d76aed4e2e9-kube-api-access-cp7vk\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf1f6b7-9b17-42b9-808b-1e95514cc372-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11528c38-6db1-493e-b785-5d76aed4e2e9-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwl7h\" (UniqueName: \"kubernetes.io/projected/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-api-access-gwl7h\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjmhg\" (UniqueName: \"kubernetes.io/projected/e93d401d-bbd4-4bac-856d-242ed681ac6e-kube-api-access-vjmhg\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-tls\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7cf1f6b7-9b17-42b9-808b-1e95514cc372-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.827983 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:41.827396 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 23 08:14:41.827983 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827400 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93d401d-bbd4-4bac-856d-242ed681ac6e-metrics-client-ca\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.827983 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:41.827481 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-tls podName:7cf1f6b7-9b17-42b9-808b-1e95514cc372 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:42.327460995 +0000 UTC m=+137.454464799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-nrm4q" (UID: "7cf1f6b7-9b17-42b9-808b-1e95514cc372") : secret "kube-state-metrics-tls" not found
Apr 23 08:14:41.827983 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827841 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.827983 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.827886 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.828292 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.828154 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-textfile\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.828804 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.828762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11528c38-6db1-493e-b785-5d76aed4e2e9-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.828911 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.828851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf1f6b7-9b17-42b9-808b-1e95514cc372-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.829800 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.829775 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11528c38-6db1-493e-b785-5d76aed4e2e9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.830045 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.830025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.830157 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.830138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11528c38-6db1-493e-b785-5d76aed4e2e9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.830329 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.830314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-tls\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.830386 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.830367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93d401d-bbd4-4bac-856d-242ed681ac6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.835777 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.835751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwl7h\" (UniqueName: \"kubernetes.io/projected/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-api-access-gwl7h\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:41.835777 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.835760 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp7vk\" (UniqueName: \"kubernetes.io/projected/11528c38-6db1-493e-b785-5d76aed4e2e9-kube-api-access-cp7vk\") pod \"openshift-state-metrics-9d44df66c-29k85\" (UID: \"11528c38-6db1-493e-b785-5d76aed4e2e9\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.836203 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.836187 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjmhg\" (UniqueName: \"kubernetes.io/projected/e93d401d-bbd4-4bac-856d-242ed681ac6e-kube-api-access-vjmhg\") pod \"node-exporter-q6vg7\" (UID: \"e93d401d-bbd4-4bac-856d-242ed681ac6e\") " pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.956221 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.956155 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"
Apr 23 08:14:41.981451 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:41.981419 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-q6vg7"
Apr 23 08:14:41.992315 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:41.992271 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93d401d_bbd4_4bac_856d_242ed681ac6e.slice/crio-8f6dd08958e13e7b5ad42832f9d8dd4a980677bc38ef30e9f7236ae64bed0f5e WatchSource:0}: Error finding container 8f6dd08958e13e7b5ad42832f9d8dd4a980677bc38ef30e9f7236ae64bed0f5e: Status 404 returned error can't find the container with id 8f6dd08958e13e7b5ad42832f9d8dd4a980677bc38ef30e9f7236ae64bed0f5e
Apr 23 08:14:42.076868 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.076834 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-29k85"]
Apr 23 08:14:42.079652 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:42.079616 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11528c38_6db1_493e_b785_5d76aed4e2e9.slice/crio-587a0844948fd03abf1ea206fe3de5a59d47597648d7db06ad180975c35433c8 WatchSource:0}: Error finding container 587a0844948fd03abf1ea206fe3de5a59d47597648d7db06ad180975c35433c8: Status 404 returned error can't find the container with id 587a0844948fd03abf1ea206fe3de5a59d47597648d7db06ad180975c35433c8
Apr 23 08:14:42.332180 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.332096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:42.334391 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.334368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cf1f6b7-9b17-42b9-808b-1e95514cc372-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nrm4q\" (UID: \"7cf1f6b7-9b17-42b9-808b-1e95514cc372\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"
Apr 23 08:14:42.574681 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.574644 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q" Apr 23 08:14:42.729458 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.729424 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nrm4q"] Apr 23 08:14:42.733633 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:42.733598 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf1f6b7_9b17_42b9_808b_1e95514cc372.slice/crio-6d39e00104d81a4551a55feb25d2e59aa2ed076deecd05d61cafce35216a502d WatchSource:0}: Error finding container 6d39e00104d81a4551a55feb25d2e59aa2ed076deecd05d61cafce35216a502d: Status 404 returned error can't find the container with id 6d39e00104d81a4551a55feb25d2e59aa2ed076deecd05d61cafce35216a502d Apr 23 08:14:42.935926 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.935880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q" event={"ID":"7cf1f6b7-9b17-42b9-808b-1e95514cc372","Type":"ContainerStarted","Data":"6d39e00104d81a4551a55feb25d2e59aa2ed076deecd05d61cafce35216a502d"} Apr 23 08:14:42.937816 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.937722 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85" event={"ID":"11528c38-6db1-493e-b785-5d76aed4e2e9","Type":"ContainerStarted","Data":"c2623585500fed82211c897a4e342db82f7b44a1b5a4fe43b8a60b0b9a863d17"} Apr 23 08:14:42.937816 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.937771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85" event={"ID":"11528c38-6db1-493e-b785-5d76aed4e2e9","Type":"ContainerStarted","Data":"7f0ed337517bd396a9d012607cc26130bd871c1f92b909819fea351d140cf5bb"} Apr 23 08:14:42.937816 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.937791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85" event={"ID":"11528c38-6db1-493e-b785-5d76aed4e2e9","Type":"ContainerStarted","Data":"587a0844948fd03abf1ea206fe3de5a59d47597648d7db06ad180975c35433c8"} Apr 23 08:14:42.938764 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:42.938739 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6vg7" event={"ID":"e93d401d-bbd4-4bac-856d-242ed681ac6e","Type":"ContainerStarted","Data":"8f6dd08958e13e7b5ad42832f9d8dd4a980677bc38ef30e9f7236ae64bed0f5e"} Apr 23 08:14:43.942943 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:43.942911 2579 generic.go:358] "Generic (PLEG): container finished" podID="e93d401d-bbd4-4bac-856d-242ed681ac6e" containerID="a3d3ce8c2cc34c2bcdb1c11844e8a7a82cce512ae6f57c7306c53933645532d3" exitCode=0 Apr 23 08:14:43.943348 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:43.942988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6vg7" event={"ID":"e93d401d-bbd4-4bac-856d-242ed681ac6e","Type":"ContainerDied","Data":"a3d3ce8c2cc34c2bcdb1c11844e8a7a82cce512ae6f57c7306c53933645532d3"} Apr 23 08:14:43.946956 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:43.946935 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85" 
event={"ID":"11528c38-6db1-493e-b785-5d76aed4e2e9","Type":"ContainerStarted","Data":"90f1219a247dce0491f77034d7bda4c84e4bd92d3c8661fc5a5ac126f2006458"} Apr 23 08:14:43.977882 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:43.977829 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-29k85" podStartSLOduration=1.9893455100000002 podStartE2EDuration="2.977811727s" podCreationTimestamp="2026-04-23 08:14:41 +0000 UTC" firstStartedPulling="2026-04-23 08:14:42.200167906 +0000 UTC m=+137.327171710" lastFinishedPulling="2026-04-23 08:14:43.188634121 +0000 UTC m=+138.315637927" observedRunningTime="2026-04-23 08:14:43.976113181 +0000 UTC m=+139.103117013" watchObservedRunningTime="2026-04-23 08:14:43.977811727 +0000 UTC m=+139.104815554" Apr 23 08:14:44.756601 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.756022 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-8499945ff4-znpmj"] Apr 23 08:14:44.759949 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.759928 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.762986 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.762963 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 08:14:44.763224 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.763209 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 08:14:44.763302 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.763214 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 08:14:44.763432 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.763416 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 08:14:44.764211 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.764198 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-24x57\"" Apr 23 08:14:44.764283 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.764241 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-od63c2f652m7\"" Apr 23 08:14:44.764351 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.764329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 08:14:44.771931 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.771911 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8499945ff4-znpmj"] Apr 23 08:14:44.852598 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-metrics-client-ca\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852717 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852627 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852717 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852839 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852727 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852839 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852754 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbx4v\" (UniqueName: \"kubernetes.io/projected/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-kube-api-access-bbx4v\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852839 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-tls\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852960 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.852960 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.852871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-grpc-tls\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.951342 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.951311 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6vg7" 
event={"ID":"e93d401d-bbd4-4bac-856d-242ed681ac6e","Type":"ContainerStarted","Data":"d16ac51153b6bc047915aab50dc884eb7c8389509e7e9a11614beddaeda2f7a4"} Apr 23 08:14:44.951651 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.951350 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6vg7" event={"ID":"e93d401d-bbd4-4bac-856d-242ed681ac6e","Type":"ContainerStarted","Data":"56b76d76ee37442f28f1ec55a0ae0c6addaa7d0b5a78d3a3e2507406525b06e2"} Apr 23 08:14:44.953149 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953127 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953206 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q" event={"ID":"7cf1f6b7-9b17-42b9-808b-1e95514cc372","Type":"ContainerStarted","Data":"29bbdd1cf211af126ef93bf249be870e139acb75662bb610e961a141f3f741f7"} Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953248 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q" event={"ID":"7cf1f6b7-9b17-42b9-808b-1e95514cc372","Type":"ContainerStarted","Data":"c4b1b5febad83e792b4bfdc24ff3072d5af306f5cd8c2aeadd6fbf59b6a40ad2"} Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953258 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q" event={"ID":"7cf1f6b7-9b17-42b9-808b-1e95514cc372","Type":"ContainerStarted","Data":"781ef5a2d7d2a4db0299ee0383078a96736f1f90c7067dd1d263838054198898"} Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbx4v\" (UniqueName: \"kubernetes.io/projected/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-kube-api-access-bbx4v\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953305 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953305 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-tls\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953653 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953653 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-grpc-tls\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.953653 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.953439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-metrics-client-ca\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.954232 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.954210 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-metrics-client-ca\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.956499 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.956472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.956569 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.956472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.956616 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.956597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.956735 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.956714 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-tls\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.956775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.956743 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-grpc-tls\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.956775 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.956719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.960627 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.960608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbx4v\" (UniqueName: \"kubernetes.io/projected/a7b0d765-1bbf-4aee-a093-b38a04ab37a7-kube-api-access-bbx4v\") pod \"thanos-querier-8499945ff4-znpmj\" (UID: \"a7b0d765-1bbf-4aee-a093-b38a04ab37a7\") " pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:44.976550 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.976510 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-q6vg7" podStartSLOduration=2.78469804 podStartE2EDuration="3.976496876s" podCreationTimestamp="2026-04-23 08:14:41 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.994031725 +0000 UTC m=+137.121035531" lastFinishedPulling="2026-04-23 08:14:43.185830549 +0000 UTC m=+138.312834367" observedRunningTime="2026-04-23 08:14:44.974889954 +0000 UTC m=+140.101893794" watchObservedRunningTime="2026-04-23 08:14:44.976496876 +0000 UTC m=+140.103500701" Apr 23 08:14:44.992692 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:44.992644 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-nrm4q" podStartSLOduration=2.77036155 podStartE2EDuration="3.99262977s" podCreationTimestamp="2026-04-23 08:14:41 +0000 UTC" firstStartedPulling="2026-04-23 08:14:42.735700297 +0000 UTC m=+137.862704105" lastFinishedPulling="2026-04-23 08:14:43.95796852 +0000 UTC m=+139.084972325" observedRunningTime="2026-04-23 08:14:44.992052 +0000 UTC m=+140.119055843" watchObservedRunningTime="2026-04-23 08:14:44.99262977 +0000 UTC m=+140.119633597" Apr 23 08:14:45.069239 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.069179 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:45.192267 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.192241 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8499945ff4-znpmj"] Apr 23 08:14:45.194413 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:45.194390 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b0d765_1bbf_4aee_a093_b38a04ab37a7.slice/crio-db28bb6269d02ac2765fd9ff8fd74b5009eeb19aeff5f9557fa5fc855b6d8bef WatchSource:0}: Error finding container db28bb6269d02ac2765fd9ff8fd74b5009eeb19aeff5f9557fa5fc855b6d8bef: Status 404 returned error can't find the container with id db28bb6269d02ac2765fd9ff8fd74b5009eeb19aeff5f9557fa5fc855b6d8bef Apr 23 08:14:45.730486 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.730456 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-jx5bv"] Apr 23 08:14:45.735108 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.735088 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:14:45.737533 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.737480 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 08:14:45.737649 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.737615 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 08:14:45.737848 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.737832 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-cfxk8\"" Apr 23 08:14:45.744894 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.744873 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jx5bv"] Apr 23 08:14:45.862205 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.862166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfx6\" (UniqueName: \"kubernetes.io/projected/41b3d83e-0d9c-4724-b970-23cf0ef08d5a-kube-api-access-ltfx6\") pod \"downloads-6bcc868b7-jx5bv\" (UID: \"41b3d83e-0d9c-4724-b970-23cf0ef08d5a\") " pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:14:45.957885 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.957848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"db28bb6269d02ac2765fd9ff8fd74b5009eeb19aeff5f9557fa5fc855b6d8bef"} Apr 23 08:14:45.963674 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.963655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfx6\" (UniqueName: \"kubernetes.io/projected/41b3d83e-0d9c-4724-b970-23cf0ef08d5a-kube-api-access-ltfx6\") pod \"downloads-6bcc868b7-jx5bv\" (UID: \"41b3d83e-0d9c-4724-b970-23cf0ef08d5a\") " pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:14:45.972509 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:45.972486 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfx6\" (UniqueName: \"kubernetes.io/projected/41b3d83e-0d9c-4724-b970-23cf0ef08d5a-kube-api-access-ltfx6\") pod 
\"downloads-6bcc868b7-jx5bv\" (UID: \"41b3d83e-0d9c-4724-b970-23cf0ef08d5a\") " pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:14:46.045615 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.045531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:14:46.182708 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:46.182666 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b3d83e_0d9c_4724_b970_23cf0ef08d5a.slice/crio-e548e8a622ffacf094ed0720b9ac888f346d21121efb1b8351643c81272ce159 WatchSource:0}: Error finding container e548e8a622ffacf094ed0720b9ac888f346d21121efb1b8351643c81272ce159: Status 404 returned error can't find the container with id e548e8a622ffacf094ed0720b9ac888f346d21121efb1b8351643c81272ce159 Apr 23 08:14:46.182834 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.182723 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jx5bv"] Apr 23 08:14:46.425454 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.425424 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2"] Apr 23 08:14:46.429978 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.429958 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:46.432334 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.432306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 08:14:46.432444 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.432355 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-whsx4\"" Apr 23 08:14:46.438918 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.438900 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2"] Apr 23 08:14:46.569178 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.569143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/769527c1-f7ac-483c-9438-d30160a91d00-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hjrx2\" (UID: \"769527c1-f7ac-483c-9438-d30160a91d00\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:46.669896 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.669866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/769527c1-f7ac-483c-9438-d30160a91d00-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hjrx2\" (UID: \"769527c1-f7ac-483c-9438-d30160a91d00\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:46.672405 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.672379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/769527c1-f7ac-483c-9438-d30160a91d00-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hjrx2\" (UID: \"769527c1-f7ac-483c-9438-d30160a91d00\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:46.739450 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.739373 
2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:46.962176 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:46.962130 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jx5bv" event={"ID":"41b3d83e-0d9c-4724-b970-23cf0ef08d5a","Type":"ContainerStarted","Data":"e548e8a622ffacf094ed0720b9ac888f346d21121efb1b8351643c81272ce159"} Apr 23 08:14:47.464836 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:47.464669 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2"] Apr 23 08:14:47.470986 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:14:47.470951 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod769527c1_f7ac_483c_9438_d30160a91d00.slice/crio-19999ae65102be254cbaa979265238dc0ec61b435ec689c1deb2758b4b9b091c WatchSource:0}: Error finding container 19999ae65102be254cbaa979265238dc0ec61b435ec689c1deb2758b4b9b091c: Status 404 returned error can't find the container with id 19999ae65102be254cbaa979265238dc0ec61b435ec689c1deb2758b4b9b091c Apr 23 08:14:47.966864 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:47.966828 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" event={"ID":"769527c1-f7ac-483c-9438-d30160a91d00","Type":"ContainerStarted","Data":"19999ae65102be254cbaa979265238dc0ec61b435ec689c1deb2758b4b9b091c"} Apr 23 08:14:47.969754 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:47.969725 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"b5ece3fae294b2ef45fc6c231f4e10c6ae1b39b1adc6650baad5f61b944dfd70"} Apr 23 08:14:47.969882 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:47.969758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"f4b4935ef01a0a804b5286208c7f58f53fc2571ca3f88920cdb1839d6f13df90"} Apr 23 08:14:47.969882 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:47.969767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"e899ebd786d832e5ee0c25e001f37f2d333dbc464dbca146608b2974bcf5d773"} Apr 23 08:14:49.977331 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.977289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" event={"ID":"769527c1-f7ac-483c-9438-d30160a91d00","Type":"ContainerStarted","Data":"78ebb98e50cdf78053ce874d5b266c2d9645ad40642f6e2dd1d01811ad1f4dd6"} Apr 23 08:14:49.977787 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.977478 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:49.980313 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.980285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"e904013807c560d9a079606cfddf6ff424211bd59efc69d16b4ce78bdd1d7fa5"} Apr 
23 08:14:49.980460 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.980319 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"aa1b2b2dc36dd7786985fa5f4e5c21aa59902066eebf31ab906650dc776c26f4"} Apr 23 08:14:49.980460 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.980333 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" event={"ID":"a7b0d765-1bbf-4aee-a093-b38a04ab37a7","Type":"ContainerStarted","Data":"b0b7098a6b8a522a64b1f7de1c9e7c453e8dcd6150c72767d24458b30ac4bf50"} Apr 23 08:14:49.980561 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.980471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:49.983747 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.983723 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" Apr 23 08:14:49.994647 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:49.994596 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hjrx2" podStartSLOduration=2.50697309 podStartE2EDuration="3.994581567s" podCreationTimestamp="2026-04-23 08:14:46 +0000 UTC" firstStartedPulling="2026-04-23 08:14:47.473758073 +0000 UTC m=+142.600761878" lastFinishedPulling="2026-04-23 08:14:48.961366551 +0000 UTC m=+144.088370355" observedRunningTime="2026-04-23 08:14:49.99237784 +0000 UTC m=+145.119381664" watchObservedRunningTime="2026-04-23 08:14:49.994581567 +0000 UTC m=+145.121585396" Apr 23 08:14:50.012329 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:50.012287 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" podStartSLOduration=2.251026491 podStartE2EDuration="6.012272624s" podCreationTimestamp="2026-04-23 08:14:44 +0000 UTC" firstStartedPulling="2026-04-23 08:14:45.196656013 +0000 UTC m=+140.323659821" lastFinishedPulling="2026-04-23 08:14:48.957902151 +0000 UTC m=+144.084905954" observedRunningTime="2026-04-23 08:14:50.011046048 +0000 UTC m=+145.138049886" watchObservedRunningTime="2026-04-23 08:14:50.012272624 +0000 UTC m=+145.139276451" Apr 23 08:14:54.463533 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:54.463504 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84ff9c579b-q6lss"] Apr 23 08:14:54.463958 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:14:54.463773 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" podUID="7580ab91-3a6f-405a-b1b4-402044f4cd59" Apr 23 08:14:54.995851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:54.995822 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:14:54.999967 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:54.999949 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:14:55.147071 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147031 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-image-registry-private-configuration\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147097 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7580ab91-3a6f-405a-b1b4-402044f4cd59-ca-trust-extracted\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147127 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv5ss\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-kube-api-access-tv5ss\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147180 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-bound-sa-token\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147381 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147221 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-trusted-ca\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147381 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147267 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-installation-pull-secrets\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147381 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147301 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-certificates\") pod \"7580ab91-3a6f-405a-b1b4-402044f4cd59\" (UID: \"7580ab91-3a6f-405a-b1b4-402044f4cd59\") " Apr 23 08:14:55.147381 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147351 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7580ab91-3a6f-405a-b1b4-402044f4cd59-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:14:55.147824 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147687 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7580ab91-3a6f-405a-b1b4-402044f4cd59-ca-trust-extracted\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.147824 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147772 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:14:55.148108 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.147982 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:14:55.150046 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.149976 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:14:55.150046 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.149994 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:14:55.150216 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.150110 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:14:55.150216 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.150110 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-kube-api-access-tv5ss" (OuterVolumeSpecName: "kube-api-access-tv5ss") pod "7580ab91-3a6f-405a-b1b4-402044f4cd59" (UID: "7580ab91-3a6f-405a-b1b4-402044f4cd59"). InnerVolumeSpecName "kube-api-access-tv5ss". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:14:55.248937 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.248868 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-image-registry-private-configuration\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.248937 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.248893 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tv5ss\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-kube-api-access-tv5ss\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.248937 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.248904 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-bound-sa-token\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.248937 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.248914 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-trusted-ca\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.248937 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.248922 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7580ab91-3a6f-405a-b1b4-402044f4cd59-installation-pull-secrets\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.248937 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.248932 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-certificates\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:55.991193 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.991162 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-8499945ff4-znpmj" Apr 23 08:14:55.999321 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:55.999291 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84ff9c579b-q6lss" Apr 23 08:14:56.042293 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:56.042259 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84ff9c579b-q6lss"] Apr 23 08:14:56.047909 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:56.047879 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-84ff9c579b-q6lss"] Apr 23 08:14:56.157355 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:56.157318 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7580ab91-3a6f-405a-b1b4-402044f4cd59-registry-tls\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:14:57.536323 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:14:57.536274 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7580ab91-3a6f-405a-b1b4-402044f4cd59" path="/var/lib/kubelet/pods/7580ab91-3a6f-405a-b1b4-402044f4cd59/volumes" Apr 23 08:15:02.774925 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:15:02.774830 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s2qrh" podUID="b1a03cf0-a55d-4e77-9ca0-a33f942b3b24" Apr 23 08:15:02.786000 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:15:02.785970 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8969w" podUID="0721f5e7-8efa-45d5-a74c-776e204f81c6" Apr 23 08:15:03.021251 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:03.021219 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s2qrh" Apr 23 08:15:03.021442 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:03.021219 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8969w" Apr 23 08:15:04.028425 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.028389 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jx5bv" event={"ID":"41b3d83e-0d9c-4724-b970-23cf0ef08d5a","Type":"ContainerStarted","Data":"0c693164fc22d6edfb085a161d54ede8b20f9baa7109e87890f8d90e5361da08"} Apr 23 08:15:04.028881 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.028606 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:15:04.044423 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.044394 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-jx5bv" Apr 23 08:15:04.048158 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.048079 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-jx5bv" podStartSLOduration=1.778439573 podStartE2EDuration="19.048046784s" podCreationTimestamp="2026-04-23 08:14:45 +0000 UTC" firstStartedPulling="2026-04-23 08:14:46.184417284 +0000 UTC m=+141.311421089" lastFinishedPulling="2026-04-23 08:15:03.454024295 +0000 UTC m=+158.581028300" observedRunningTime="2026-04-23 08:15:04.045462513 +0000 UTC m=+159.172466360" watchObservedRunningTime="2026-04-23 08:15:04.048046784 +0000 UTC m=+159.175050609" Apr 23 08:15:04.450750 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.450708 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cbf5fc696-khdq5"] Apr 23 08:15:04.455520 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.455492 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.463594 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.463565 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:15:04.465075 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.463845 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:15:04.465075 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.463848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:15:04.465075 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.464238 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:15:04.465075 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.464469 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-s87b2\""
Apr 23 08:15:04.465434 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.465183 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:15:04.470244 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.470221 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 08:15:04.483474 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.483427 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cbf5fc696-khdq5"]
Apr 23 08:15:04.634319 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634282 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-config\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.634500 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634352 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-service-ca\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.634500 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-serving-cert\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.634500 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqhd\" (UniqueName: \"kubernetes.io/projected/a74f1f05-18fe-4d8b-b37a-980fffd64051-kube-api-access-xbqhd\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.634646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634498 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-oauth-config\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.634646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-oauth-serving-cert\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.634646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.634608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-trusted-ca-bundle\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.735758 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-serving-cert\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.735758 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqhd\" (UniqueName: \"kubernetes.io/projected/a74f1f05-18fe-4d8b-b37a-980fffd64051-kube-api-access-xbqhd\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.735962 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-oauth-config\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.735962 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-oauth-serving-cert\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.735962 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-trusted-ca-bundle\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.735962 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-config\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.736216 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.735963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-service-ca\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.736751 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.736726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-service-ca\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.736859 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.736820 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-config\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.736926 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.736894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-oauth-serving-cert\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.737128 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.736976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-trusted-ca-bundle\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.738683 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.738660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-serving-cert\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.738766 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.738744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-oauth-config\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.745323 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.745299 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqhd\" (UniqueName: \"kubernetes.io/projected/a74f1f05-18fe-4d8b-b37a-980fffd64051-kube-api-access-xbqhd\") pod \"console-5cbf5fc696-khdq5\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") " pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.769649 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.769285 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:04.915412 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:04.915389 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cbf5fc696-khdq5"]
Apr 23 08:15:04.918388 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:15:04.918363 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74f1f05_18fe_4d8b_b37a_980fffd64051.slice/crio-28fbe055387ac77c6a61a0c0e2150ce445e716f60111d3104b9b5e0ee443d2bf WatchSource:0}: Error finding container 28fbe055387ac77c6a61a0c0e2150ce445e716f60111d3104b9b5e0ee443d2bf: Status 404 returned error can't find the container with id 28fbe055387ac77c6a61a0c0e2150ce445e716f60111d3104b9b5e0ee443d2bf
Apr 23 08:15:05.033455 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:05.033376 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbf5fc696-khdq5" event={"ID":"a74f1f05-18fe-4d8b-b37a-980fffd64051","Type":"ContainerStarted","Data":"28fbe055387ac77c6a61a0c0e2150ce445e716f60111d3104b9b5e0ee443d2bf"}
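The manager.go warning just above reads as a startup race rather than a failure: cAdvisor sees the new container's cgroup before the runtime can answer for that container ID, gets a 404, and the PLEG ContainerStarted event for the same ID (28fbe055...) follows moments later. The slice name also encodes the pod UID, written with underscores because dashes nest slices in systemd unit names; a small sketch recovering it (regex is illustrative, verified against the UID in the surrounding entries):

```python
import re

def pod_uid_from_slice(cgroup_path: str):
    """Recover a pod UID from a kubepods slice path; the UID is stored with
    '-' replaced by '_' since dashes act as hierarchy separators in slices."""
    m = re.search(r'-pod([0-9a-f_]+)\.slice', cgroup_path)
    return m.group(1).replace('_', '-') if m else None

path = ('/kubepods.slice/kubepods-burstable.slice/'
        'kubepods-burstable-poda74f1f05_18fe_4d8b_b37a_980fffd64051.slice/'
        'crio-28fbe055387ac77c6a61a0c0e2150ce445e716f60111d3104b9b5e0ee443d2bf')
assert pod_uid_from_slice(path) == 'a74f1f05-18fe-4d8b-b37a-980fffd64051'
```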
Apr 23 08:15:07.664640 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.664425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh"
Apr 23 08:15:07.667402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.667379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a03cf0-a55d-4e77-9ca0-a33f942b3b24-metrics-tls\") pod \"dns-default-s2qrh\" (UID: \"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24\") " pod="openshift-dns/dns-default-s2qrh"
Apr 23 08:15:07.765751 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.765710 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w"
Apr 23 08:15:07.768919 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.768893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0721f5e7-8efa-45d5-a74c-776e204f81c6-cert\") pod \"ingress-canary-8969w\" (UID: \"0721f5e7-8efa-45d5-a74c-776e204f81c6\") " pod="openshift-ingress-canary/ingress-canary-8969w"
Apr 23 08:15:07.824873 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.824840 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sb9bx\""
Apr 23 08:15:07.825886 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.825851 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dcms6\""
Apr 23 08:15:07.832454 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.832434 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s2qrh"
Apr 23 08:15:07.832646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:07.832565 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8969w"
Apr 23 08:15:08.374873 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:08.374786 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s2qrh"]
Apr 23 08:15:08.378200 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:15:08.378167 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a03cf0_a55d_4e77_9ca0_a33f942b3b24.slice/crio-b75cfaa0597207fa4427c69bbb244b711927f6e92a1375f8badb1df0acea1fae WatchSource:0}: Error finding container b75cfaa0597207fa4427c69bbb244b711927f6e92a1375f8badb1df0acea1fae: Status 404 returned error can't find the container with id b75cfaa0597207fa4427c69bbb244b711927f6e92a1375f8badb1df0acea1fae
Apr 23 08:15:08.401930 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:08.401903 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8969w"]
Apr 23 08:15:09.048749 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:09.048711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbf5fc696-khdq5" event={"ID":"a74f1f05-18fe-4d8b-b37a-980fffd64051","Type":"ContainerStarted","Data":"2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522"}
Apr 23 08:15:09.051125 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:09.051095 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s2qrh" event={"ID":"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24","Type":"ContainerStarted","Data":"b75cfaa0597207fa4427c69bbb244b711927f6e92a1375f8badb1df0acea1fae"}
Apr 23 08:15:09.052938 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:09.052893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8969w" event={"ID":"0721f5e7-8efa-45d5-a74c-776e204f81c6","Type":"ContainerStarted","Data":"2a878ecfbc7a4d04217d7d16c7888f24ad9b1df474060ef696e3ad93c893c84b"}
Apr 23 08:15:12.063792 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.063702 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8969w" event={"ID":"0721f5e7-8efa-45d5-a74c-776e204f81c6","Type":"ContainerStarted","Data":"e311c953e3f27d08b83f97dd822e6ea715decdb813b6215e680e3e22a28d01ec"}
Apr 23 08:15:12.065652 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.065613 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s2qrh" event={"ID":"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24","Type":"ContainerStarted","Data":"dd5ecfe947895cf2ad1fd45e5ddd70096313f16249708790179c1a3a507fa3a4"}
Apr 23 08:15:12.065652 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.065649 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s2qrh" event={"ID":"b1a03cf0-a55d-4e77-9ca0-a33f942b3b24","Type":"ContainerStarted","Data":"47607ce7d4f12c1439222c1545b549ad715c06b2ddbac1613be639e7507d2df5"}
Apr 23 08:15:12.065823 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.065752 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-s2qrh"
Apr 23 08:15:12.079755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.079691 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8969w" podStartSLOduration=130.045443582 podStartE2EDuration="2m13.079677341s" podCreationTimestamp="2026-04-23 08:12:59 +0000 UTC" firstStartedPulling="2026-04-23 08:15:08.517766556 +0000 UTC m=+163.644770362" lastFinishedPulling="2026-04-23 08:15:11.552000308 +0000 UTC m=+166.679004121" observedRunningTime="2026-04-23 08:15:12.078813003 +0000 UTC m=+167.205816841" watchObservedRunningTime="2026-04-23 08:15:12.079677341 +0000 UTC m=+167.206681168"
Apr 23 08:15:12.080246 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.080207 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cbf5fc696-khdq5" podStartSLOduration=4.474979139 podStartE2EDuration="8.080196662s" podCreationTimestamp="2026-04-23 08:15:04 +0000 UTC" firstStartedPulling="2026-04-23 08:15:04.920720332 +0000 UTC m=+160.047724137" lastFinishedPulling="2026-04-23 08:15:08.525937856 +0000 UTC m=+163.652941660" observedRunningTime="2026-04-23 08:15:09.071992789 +0000 UTC m=+164.198996615" watchObservedRunningTime="2026-04-23 08:15:12.080196662 +0000 UTC m=+167.207200487"
Apr 23 08:15:12.097731 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:12.097676 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s2qrh" podStartSLOduration=129.930943159 podStartE2EDuration="2m13.097663579s" podCreationTimestamp="2026-04-23 08:12:59 +0000 UTC" firstStartedPulling="2026-04-23 08:15:08.379894377 +0000 UTC m=+163.506898197" lastFinishedPulling="2026-04-23 08:15:11.546614805 +0000 UTC m=+166.673618617" observedRunningTime="2026-04-23 08:15:12.095621701 +0000 UTC m=+167.222625529" watchObservedRunningTime="2026-04-23 08:15:12.097663579 +0000 UTC m=+167.224667405"
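The three "Observed pod startup duration" entries above expose the tracker's arithmetic: podStartSLOduration is the end-to-end duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. startup time excluding pulls. A quick check against the console pod's numbers (timestamps truncated to microseconds, so the result agrees to within nanosecond rounding):

```python
from datetime import datetime, timedelta

def ts(s: str) -> datetime:
    # "2026-04-23 08:15:04.920720332 +0000 UTC" -> keep date + time, truncate ns to us
    date, time = s.split()[:2]
    return datetime.fromisoformat(f"{date} {time[:15]}")

e2e = timedelta(seconds=8.080196662)  # podStartE2EDuration for console-5cbf5fc696-khdq5
pull = ts("2026-04-23 08:15:08.525937856 +0000 UTC") - ts("2026-04-23 08:15:04.920720332 +0000 UTC")
print((e2e - pull).total_seconds())   # ~4.474980, matching podStartSLOduration=4.474979139
```

The same subtraction reproduces the DaemonSet pods' figures, e.g. 2m13.079677341s minus the 3.034233752 s pull window gives the ingress-canary entry's podStartSLOduration=130.045443582 to within rounding.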
Apr 23 08:15:14.770162 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:14.770122 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:14.770162 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:14.770171 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:14.774766 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:14.774745 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:15.079572 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:15.079492 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:15:22.073038 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:22.073008 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s2qrh"
Apr 23 08:15:23.098483 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:23.098440 2579 generic.go:358] "Generic (PLEG): container finished" podID="1a2fda67-d89f-4fa0-a61a-91dfc33b57fd" containerID="cd417f62c894e9b362d222223a9e16abab67ee58b0907d3f0d024d97f3af6bc0" exitCode=0
Apr 23 08:15:23.098840 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:23.098520 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n42s5" event={"ID":"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd","Type":"ContainerDied","Data":"cd417f62c894e9b362d222223a9e16abab67ee58b0907d3f0d024d97f3af6bc0"}
Apr 23 08:15:23.098883 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:23.098870 2579 scope.go:117] "RemoveContainer" containerID="cd417f62c894e9b362d222223a9e16abab67ee58b0907d3f0d024d97f3af6bc0"
Apr 23 08:15:23.608243 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:23.608173 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s2qrh_b1a03cf0-a55d-4e77-9ca0-a33f942b3b24/dns/0.log"
Apr 23 08:15:23.613520 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:23.613502 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s2qrh_b1a03cf0-a55d-4e77-9ca0-a33f942b3b24/kube-rbac-proxy/0.log"
Apr 23 08:15:23.890182 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:23.890161 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hnjm9_cf4bc19e-bb95-4e2e-9978-4a53f064696d/dns-node-resolver/0.log"
Apr 23 08:15:24.103332 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:15:24.103301 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n42s5" event={"ID":"1a2fda67-d89f-4fa0-a61a-91dfc33b57fd","Type":"ContainerStarted","Data":"01e05768e8b0a82ec7bd31de306ced87bca58761e060a50a33fde3dc073e9eed"}
Apr 23 08:16:06.071345 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.071315 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"]
Apr 23 08:16:06.076393 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.076370 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.078792 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.078768 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 08:16:06.078960 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.078850 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j9nvn\""
Apr 23 08:16:06.079090 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.078861 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 08:16:06.079189 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.079008 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 08:16:06.079257 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.079211 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 08:16:06.079349 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.079322 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 08:16:06.083558 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.083538 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 08:16:06.091538 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.091517 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"]
Apr 23 08:16:06.228085 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228024 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-metrics-client-ca\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228262 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228097 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228262 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228150 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-serving-certs-ca-bundle\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228262 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-telemeter-client-tls\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228262 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-federate-client-tls\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228422 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-secret-telemeter-client\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228422 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228314 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pfmb\" (UniqueName: \"kubernetes.io/projected/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-kube-api-access-9pfmb\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.228422 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.228361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.328875 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-serving-certs-ca-bundle\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.328875 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-telemeter-client-tls\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.328875 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328853 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-federate-client-tls\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.328875 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-secret-telemeter-client\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329182 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pfmb\" (UniqueName: \"kubernetes.io/projected/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-kube-api-access-9pfmb\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329182 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329182 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-metrics-client-ca\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329182 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.328980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329663 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.329627 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-serving-certs-ca-bundle\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329771 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.329722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-metrics-client-ca\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.329826 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.329798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.332407 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.332380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-telemeter-client-tls\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.332492 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.332413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-secret-telemeter-client\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.332492 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.332461 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.332492 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.332481 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-federate-client-tls\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.337764 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.337737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pfmb\" (UniqueName: \"kubernetes.io/projected/1f14cf83-8b6a-404f-bb4f-02f6a51d1596-kube-api-access-9pfmb\") pod \"telemeter-client-6d5bbd96d6-ql9h9\" (UID: \"1f14cf83-8b6a-404f-bb4f-02f6a51d1596\") " pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.386107 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.386078 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"
Apr 23 08:16:06.517227 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:06.517190 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9"]
Apr 23 08:16:06.521424 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:16:06.521393 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f14cf83_8b6a_404f_bb4f_02f6a51d1596.slice/crio-e88049eca9848d85ed2fc6bfcd80775979fbab42c490e6e369d18dab4b3aaf58 WatchSource:0}: Error finding container e88049eca9848d85ed2fc6bfcd80775979fbab42c490e6e369d18dab4b3aaf58: Status 404 returned error can't find the container with id e88049eca9848d85ed2fc6bfcd80775979fbab42c490e6e369d18dab4b3aaf58
Apr 23 08:16:07.235896 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:07.235858 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9" event={"ID":"1f14cf83-8b6a-404f-bb4f-02f6a51d1596","Type":"ContainerStarted","Data":"e88049eca9848d85ed2fc6bfcd80775979fbab42c490e6e369d18dab4b3aaf58"}
Apr 23 08:16:09.243175 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:09.243094 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9" event={"ID":"1f14cf83-8b6a-404f-bb4f-02f6a51d1596","Type":"ContainerStarted","Data":"4f7a49187bd0c52b9fe8626ea05b7b00975f968206671b0a31ffae638dff7c3f"}
Apr 23 08:16:11.250988 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:11.250953 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9" event={"ID":"1f14cf83-8b6a-404f-bb4f-02f6a51d1596","Type":"ContainerStarted","Data":"a069dea53d9b8434889e7388287ff8b63bb983058039d51356ee6bb7a467a692"}
Apr 23 08:16:11.250988 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:11.250992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9" event={"ID":"1f14cf83-8b6a-404f-bb4f-02f6a51d1596","Type":"ContainerStarted","Data":"ddbdce5638eec51ff456a1bb5287b9211c3d7fb671c8ad2a8a73baaff73012b4"}
Apr 23 08:16:11.276586 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:11.276530 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6d5bbd96d6-ql9h9" podStartSLOduration=1.149661473 podStartE2EDuration="5.276514181s" podCreationTimestamp="2026-04-23 08:16:06 +0000 UTC" firstStartedPulling="2026-04-23 08:16:06.523194514 +0000 UTC m=+221.650198318" lastFinishedPulling="2026-04-23 08:16:10.650047213 +0000 UTC m=+225.777051026" observedRunningTime="2026-04-23 08:16:11.274881191 +0000 UTC m=+226.401885016" watchObservedRunningTime="2026-04-23 08:16:11.276514181 +0000 UTC m=+226.403518010"
Apr 23 08:16:22.315360 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:22.315322 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cbf5fc696-khdq5"]
Apr 23 08:16:47.336952 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.336905 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5cbf5fc696-khdq5" podUID="a74f1f05-18fe-4d8b-b37a-980fffd64051" containerName="console" containerID="cri-o://2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522" gracePeriod=15
Apr 23 08:16:47.584412 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.584390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cbf5fc696-khdq5_a74f1f05-18fe-4d8b-b37a-980fffd64051/console/0.log"
Apr 23 08:16:47.584565 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.584447 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:16:47.640144 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640108 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-service-ca\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640166 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-oauth-config\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640194 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-oauth-serving-cert\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640218 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-config\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640234 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-trusted-ca-bundle\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640285 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640260 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-serving-cert\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640515 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640299 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbqhd\" (UniqueName: \"kubernetes.io/projected/a74f1f05-18fe-4d8b-b37a-980fffd64051-kube-api-access-xbqhd\") pod \"a74f1f05-18fe-4d8b-b37a-980fffd64051\" (UID: \"a74f1f05-18fe-4d8b-b37a-980fffd64051\") "
Apr 23 08:16:47.640590 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640562 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-config" (OuterVolumeSpecName: "console-config") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:16:47.640646 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640581 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:16:47.640699 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640645 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-service-ca" (OuterVolumeSpecName: "service-ca") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:16:47.640783 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.640746 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:16:47.642587 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.642560 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:47.642677 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.642607 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:47.642677 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.642619 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74f1f05-18fe-4d8b-b37a-980fffd64051-kube-api-access-xbqhd" (OuterVolumeSpecName: "kube-api-access-xbqhd") pod "a74f1f05-18fe-4d8b-b37a-980fffd64051" (UID: "a74f1f05-18fe-4d8b-b37a-980fffd64051"). InnerVolumeSpecName "kube-api-access-xbqhd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:16:47.741863 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741828 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbqhd\" (UniqueName: \"kubernetes.io/projected/a74f1f05-18fe-4d8b-b37a-980fffd64051-kube-api-access-xbqhd\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:47.741863 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741857 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-service-ca\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:47.741863 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741866 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-oauth-config\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:47.742113 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741875 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-oauth-serving-cert\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:47.742113 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741886 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-config\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:47.742113 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741895 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74f1f05-18fe-4d8b-b37a-980fffd64051-trusted-ca-bundle\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:47.742113 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:47.741903 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a74f1f05-18fe-4d8b-b37a-980fffd64051-console-serving-cert\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\""
Apr 23 08:16:48.357917 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.357891 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cbf5fc696-khdq5_a74f1f05-18fe-4d8b-b37a-980fffd64051/console/0.log"
Apr 23 08:16:48.358366 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.357929 2579 generic.go:358] "Generic (PLEG): container finished" podID="a74f1f05-18fe-4d8b-b37a-980fffd64051" containerID="2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522" exitCode=2
Apr 23 08:16:48.358366 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.358025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbf5fc696-khdq5" event={"ID":"a74f1f05-18fe-4d8b-b37a-980fffd64051","Type":"ContainerDied","Data":"2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522"}
Apr 23 08:16:48.358366 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.358033 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cbf5fc696-khdq5"
Apr 23 08:16:48.358366 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.358076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbf5fc696-khdq5" event={"ID":"a74f1f05-18fe-4d8b-b37a-980fffd64051","Type":"ContainerDied","Data":"28fbe055387ac77c6a61a0c0e2150ce445e716f60111d3104b9b5e0ee443d2bf"}
Apr 23 08:16:48.358366 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.358109 2579 scope.go:117] "RemoveContainer" containerID="2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522"
Apr 23 08:16:48.369810 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.369790 2579 scope.go:117] "RemoveContainer" containerID="2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522"
Apr 23 08:16:48.370048 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:16:48.370028 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522\": container with ID starting with 2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522 not found: ID does not exist" containerID="2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522"
Apr 23 08:16:48.370124 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.370075 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522"} err="failed to get container status \"2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522\": rpc error: code = NotFound desc = could not find container \"2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522\": container with ID starting with 2cee608322304960d674c6098bd9678f82917c599b424a16987937bcb2d4d522 not found: ID does not exist"
Apr 23 08:16:48.381918 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.381894 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cbf5fc696-khdq5"]
Apr 23 08:16:48.386918 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:48.386897 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cbf5fc696-khdq5"]
Apr 23 08:16:49.535139 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:16:49.535110 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74f1f05-18fe-4d8b-b37a-980fffd64051" path="/var/lib/kubelet/pods/a74f1f05-18fe-4d8b-b37a-980fffd64051/volumes"
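Worked timing for the teardown above, straight from the entries: the API DELETE arrives at 08:16:22.315, the graceful kill (gracePeriod=15) follows at 08:16:47.337, and PLEG reports the container finished about a second later with exitCode=2 (the process's own exit status, not a 128+signal code), well inside the grace window, which is why no forced-kill entry appears:

```python
from datetime import datetime

t = lambda s: datetime.strptime(s, "%H:%M:%S.%f")
deleted = t("08:16:22.315360")  # SyncLoop DELETE
killed  = t("08:16:47.336952")  # Killing container with a grace period, gracePeriod=15
died    = t("08:16:48.358366")  # Generic (PLEG): container finished, exitCode=2

print((killed - deleted).total_seconds())  # 25.021592 s from API delete to the kill
print((died - killed).total_seconds())     # 1.021414 s to exit, within the 15 s grace period
```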
Apr 23 08:17:25.399565 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:25.399538 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log"
Apr 23 08:17:25.399971 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:25.399670 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log"
Apr 23 08:17:25.402427 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:25.402406 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 08:17:39.428108 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.428073 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56c649c4cb-vbvbb"]
Apr 23 08:17:39.430440 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.428403 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a74f1f05-18fe-4d8b-b37a-980fffd64051" containerName="console"
Apr 23 08:17:39.430440 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.428417 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74f1f05-18fe-4d8b-b37a-980fffd64051" containerName="console"
Apr 23 08:17:39.430440 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.428472 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a74f1f05-18fe-4d8b-b37a-980fffd64051" containerName="console"
Apr 23 08:17:39.431296 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.431279 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.436651 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.436614 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:17:39.436882 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.436860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:17:39.436961 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.436920 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-s87b2\""
Apr 23 08:17:39.436961 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.436947 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:17:39.437083 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.436965 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:17:39.438119 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.437909 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:17:39.440732 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.440712 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 08:17:39.444906 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.444887 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c649c4cb-vbvbb"]
Apr 23 08:17:39.522736 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-oauth-serving-cert\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.522893 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-service-ca\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.522893 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522769 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-config\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.522893 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-trusted-ca-bundle\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.522893 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-oauth-config\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.523034 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-serving-cert\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.523034 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.522936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf9v4\" (UniqueName: \"kubernetes.io/projected/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-kube-api-access-mf9v4\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623544 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-oauth-serving-cert\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623544 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-service-ca\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623756 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-config\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623756 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623727 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-trusted-ca-bundle\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623861 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623772 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-oauth-config\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623861 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-serving-cert\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.623861 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.623845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf9v4\" (UniqueName: \"kubernetes.io/projected/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-kube-api-access-mf9v4\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.624344 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.624321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-oauth-serving-cert\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.624446 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.624321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-service-ca\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.624446 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.624357 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-config\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.624728 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.624573 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-trusted-ca-bundle\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.626402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.626380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-oauth-config\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.626481 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.626465 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-console-serving-cert\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.632223 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.632201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf9v4\" (UniqueName: \"kubernetes.io/projected/cb4e65b0-e6a4-460a-b7cd-342ac8714a2b-kube-api-access-mf9v4\") pod \"console-56c649c4cb-vbvbb\" (UID: \"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b\") " pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.741296 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.741218 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:39.869385 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.869349 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c649c4cb-vbvbb"]
Apr 23 08:17:39.872231 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:17:39.872200 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4e65b0_e6a4_460a_b7cd_342ac8714a2b.slice/crio-710acde8485889dcd1abc171cab801303f863ae317d62df95f460db746ad8d62 WatchSource:0}: Error finding container 710acde8485889dcd1abc171cab801303f863ae317d62df95f460db746ad8d62: Status 404 returned error can't find the container with id 710acde8485889dcd1abc171cab801303f863ae317d62df95f460db746ad8d62
Apr 23 08:17:39.874110 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:39.874092 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:17:40.512180 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:40.512139 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c649c4cb-vbvbb" event={"ID":"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b","Type":"ContainerStarted","Data":"cad4ad62d3155ab59629289efa2e7e035985061f8df9dd548c58193c744dc770"}
Apr 23 08:17:40.512180 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:40.512182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c649c4cb-vbvbb" event={"ID":"cb4e65b0-e6a4-460a-b7cd-342ac8714a2b","Type":"ContainerStarted","Data":"710acde8485889dcd1abc171cab801303f863ae317d62df95f460db746ad8d62"}
Apr 23 08:17:40.531240 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:40.531188 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56c649c4cb-vbvbb" podStartSLOduration=1.531170929 podStartE2EDuration="1.531170929s" podCreationTimestamp="2026-04-23 08:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:17:40.529782869 +0000 UTC m=+315.656786695" watchObservedRunningTime="2026-04-23 08:17:40.531170929 +0000 UTC m=+315.658174756"
Apr 23 08:17:49.742245 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:49.742212 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:49.742601 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:49.742496 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:49.748097 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:49.747841 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:17:50.542476 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:17:50.542445 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56c649c4cb-vbvbb"
Apr 23 08:18:31.105825 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.105752 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zvh94/must-gather-hf492"]
Apr 23 08:18:31.109225 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.109205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.111587 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.111556 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zvh94\"/\"default-dockercfg-f7ghp\""
Apr 23 08:18:31.111689 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.111593 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zvh94\"/\"kube-root-ca.crt\""
Apr 23 08:18:31.111860 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.111844 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zvh94\"/\"openshift-service-ca.crt\""
Apr 23 08:18:31.116394 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.116374 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zvh94/must-gather-hf492"]
Apr 23 08:18:31.210424 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.210395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d72ac3d-027c-4ef4-afed-2c030a463968-must-gather-output\") pod \"must-gather-hf492\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.210566 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.210452 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvsk\" (UniqueName: \"kubernetes.io/projected/0d72ac3d-027c-4ef4-afed-2c030a463968-kube-api-access-6lvsk\") pod \"must-gather-hf492\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.311009 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.310983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvsk\" (UniqueName: \"kubernetes.io/projected/0d72ac3d-027c-4ef4-afed-2c030a463968-kube-api-access-6lvsk\") pod \"must-gather-hf492\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.311125 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.311035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d72ac3d-027c-4ef4-afed-2c030a463968-must-gather-output\") pod \"must-gather-hf492\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.311310 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.311294 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d72ac3d-027c-4ef4-afed-2c030a463968-must-gather-output\") pod \"must-gather-hf492\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.319091 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.319073 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvsk\" (UniqueName: \"kubernetes.io/projected/0d72ac3d-027c-4ef4-afed-2c030a463968-kube-api-access-6lvsk\") pod \"must-gather-hf492\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.426447 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.426427 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvh94/must-gather-hf492"
Apr 23 08:18:31.542490 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.542469 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zvh94/must-gather-hf492"]
Apr 23 08:18:31.545162 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:18:31.545132 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d72ac3d_027c_4ef4_afed_2c030a463968.slice/crio-fcdd18dd113bc006f1c0fd62495a1135549abd6e94205ccce89915e527e1abce WatchSource:0}: Error finding container fcdd18dd113bc006f1c0fd62495a1135549abd6e94205ccce89915e527e1abce: Status 404 returned error can't find the container with id fcdd18dd113bc006f1c0fd62495a1135549abd6e94205ccce89915e527e1abce
Apr 23 08:18:31.654780 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:31.654750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvh94/must-gather-hf492" event={"ID":"0d72ac3d-027c-4ef4-afed-2c030a463968","Type":"ContainerStarted","Data":"fcdd18dd113bc006f1c0fd62495a1135549abd6e94205ccce89915e527e1abce"}
Apr 23 08:18:36.675498 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:36.675464 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvh94/must-gather-hf492" event={"ID":"0d72ac3d-027c-4ef4-afed-2c030a463968","Type":"ContainerStarted","Data":"7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d"}
Apr 23 08:18:37.680577 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:37.680538 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvh94/must-gather-hf492" event={"ID":"0d72ac3d-027c-4ef4-afed-2c030a463968","Type":"ContainerStarted","Data":"e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748"}
Apr 23 08:18:37.700038 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:18:37.699975 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zvh94/must-gather-hf492" podStartSLOduration=1.713973829 podStartE2EDuration="6.699960796s" podCreationTimestamp="2026-04-23 08:18:31 +0000 UTC" firstStartedPulling="2026-04-23 08:18:31.546998812 +0000 UTC m=+366.674002629" lastFinishedPulling="2026-04-23 08:18:36.532985792 +0000 UTC m=+371.659989596" observedRunningTime="2026-04-23 08:18:37.698292003 +0000 UTC m=+372.825295831" watchObservedRunningTime="2026-04-23 08:18:37.699960796 +0000 UTC m=+372.826964622"
Apr 23 08:19:19.809140 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:19.809106 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerID="7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d" exitCode=0
Apr 23 08:19:19.809539 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:19.809187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvh94/must-gather-hf492"
event={"ID":"0d72ac3d-027c-4ef4-afed-2c030a463968","Type":"ContainerDied","Data":"7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d"} Apr 23 08:19:19.809539 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:19.809494 2579 scope.go:117] "RemoveContainer" containerID="7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d" Apr 23 08:19:19.956298 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:19.956268 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvh94_must-gather-hf492_0d72ac3d-027c-4ef4-afed-2c030a463968/gather/0.log" Apr 23 08:19:23.149676 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:23.149649 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x8j9k_98eee7d8-bae0-438a-887a-8591437a310c/global-pull-secret-syncer/0.log" Apr 23 08:19:23.219666 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:23.219618 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7mwd7_65461c01-4562-4d4b-86d7-6491c2bd2b8c/konnectivity-agent/0.log" Apr 23 08:19:23.312076 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:23.312032 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-255.ec2.internal_e688fec9147a531ae0f3ba981a4ec304/haproxy/0.log" Apr 23 08:19:25.258699 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.258664 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zvh94/must-gather-hf492"] Apr 23 08:19:25.259201 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.258900 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-zvh94/must-gather-hf492" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="copy" containerID="cri-o://e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748" gracePeriod=2 Apr 23 08:19:25.264011 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.263983 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zvh94/must-gather-hf492"] Apr 23 08:19:25.488212 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.488190 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvh94_must-gather-hf492_0d72ac3d-027c-4ef4-afed-2c030a463968/copy/0.log" Apr 23 08:19:25.488535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.488520 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvh94/must-gather-hf492" Apr 23 08:19:25.490525 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.490491 2579 status_manager.go:895] "Failed to get status for pod" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" pod="openshift-must-gather-zvh94/must-gather-hf492" err="pods \"must-gather-hf492\" is forbidden: User \"system:node:ip-10-0-142-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zvh94\": no relationship found between node 'ip-10-0-142-255.ec2.internal' and this object" Apr 23 08:19:25.534901 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.534836 2579 status_manager.go:895] "Failed to get status for pod" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" pod="openshift-must-gather-zvh94/must-gather-hf492" err="pods \"must-gather-hf492\" is forbidden: User \"system:node:ip-10-0-142-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-zvh94\": no relationship found between node 'ip-10-0-142-255.ec2.internal' and this object" Apr 23 08:19:25.682663 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.682623 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lvsk\" (UniqueName: \"kubernetes.io/projected/0d72ac3d-027c-4ef4-afed-2c030a463968-kube-api-access-6lvsk\") pod \"0d72ac3d-027c-4ef4-afed-2c030a463968\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " Apr 23 08:19:25.682820 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.682679 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d72ac3d-027c-4ef4-afed-2c030a463968-must-gather-output\") pod \"0d72ac3d-027c-4ef4-afed-2c030a463968\" (UID: \"0d72ac3d-027c-4ef4-afed-2c030a463968\") " Apr 23 08:19:25.683927 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.683900 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d72ac3d-027c-4ef4-afed-2c030a463968-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0d72ac3d-027c-4ef4-afed-2c030a463968" (UID: "0d72ac3d-027c-4ef4-afed-2c030a463968"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:19:25.684913 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.684880 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d72ac3d-027c-4ef4-afed-2c030a463968-kube-api-access-6lvsk" (OuterVolumeSpecName: "kube-api-access-6lvsk") pod "0d72ac3d-027c-4ef4-afed-2c030a463968" (UID: "0d72ac3d-027c-4ef4-afed-2c030a463968"). InnerVolumeSpecName "kube-api-access-6lvsk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:19:25.783647 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.783622 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6lvsk\" (UniqueName: \"kubernetes.io/projected/0d72ac3d-027c-4ef4-afed-2c030a463968-kube-api-access-6lvsk\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:19:25.783647 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.783647 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d72ac3d-027c-4ef4-afed-2c030a463968-must-gather-output\") on node \"ip-10-0-142-255.ec2.internal\" DevicePath \"\"" Apr 23 08:19:25.827121 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.827048 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvh94_must-gather-hf492_0d72ac3d-027c-4ef4-afed-2c030a463968/copy/0.log" Apr 23 08:19:25.827407 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.827382 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerID="e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748" exitCode=143 Apr 23 08:19:25.827468 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.827437 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvh94/must-gather-hf492" Apr 23 08:19:25.827525 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.827491 2579 scope.go:117] "RemoveContainer" containerID="e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748" Apr 23 08:19:25.836997 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.836976 2579 scope.go:117] "RemoveContainer" containerID="7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d" Apr 23 08:19:25.848143 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.848117 2579 scope.go:117] "RemoveContainer" containerID="e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748" Apr 23 08:19:25.848364 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:19:25.848346 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748\": container with ID starting with e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748 not found: ID does not exist" containerID="e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748" Apr 23 08:19:25.848409 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.848375 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748"} err="failed to get container status \"e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748\": rpc error: code = NotFound desc = could not find container \"e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748\": container with ID starting with e498c98682508a115360ba4d2a7d146f290cc20d5c4321b931e6c4a803aea748 not found: ID does not exist" Apr 23 08:19:25.848409 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.848393 2579 scope.go:117] "RemoveContainer" containerID="7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d" Apr 23 08:19:25.848585 ip-10-0-142-255 kubenswrapper[2579]: E0423 08:19:25.848569 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d\": container with ID starting with 7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d not found: ID does not exist" containerID="7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d" Apr 23 08:19:25.848628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:25.848590 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d"} err="failed to get container status \"7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d\": rpc error: code = NotFound desc = could not find container \"7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d\": container with ID starting with 7b5124faaedd9088d619ea349dce34aced45d2cad51922944163cb2969d6dc7d not found: ID does not exist" Apr 23 08:19:26.495499 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.495470 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-xvxq4_1ce3ff11-bba5-4c93-b9a5-9da3915e6c2a/cluster-monitoring-operator/0.log" Apr 23 08:19:26.517583 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.517555 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nrm4q_7cf1f6b7-9b17-42b9-808b-1e95514cc372/kube-state-metrics/0.log" Apr 23 08:19:26.537241 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.537214 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nrm4q_7cf1f6b7-9b17-42b9-808b-1e95514cc372/kube-rbac-proxy-main/0.log" Apr 23 08:19:26.563183 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.563158 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nrm4q_7cf1f6b7-9b17-42b9-808b-1e95514cc372/kube-rbac-proxy-self/0.log" Apr 23 08:19:26.612215 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.612192 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hjrx2_769527c1-f7ac-483c-9438-d30160a91d00/monitoring-plugin/0.log" Apr 23 08:19:26.710346 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.710315 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q6vg7_e93d401d-bbd4-4bac-856d-242ed681ac6e/node-exporter/0.log" Apr 23 08:19:26.728995 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.728975 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q6vg7_e93d401d-bbd4-4bac-856d-242ed681ac6e/kube-rbac-proxy/0.log" Apr 23 08:19:26.749771 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.749716 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q6vg7_e93d401d-bbd4-4bac-856d-242ed681ac6e/init-textfile/0.log" Apr 23 08:19:26.846628 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.846612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-29k85_11528c38-6db1-493e-b785-5d76aed4e2e9/kube-rbac-proxy-main/0.log" Apr 23 08:19:26.867417 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.867390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-29k85_11528c38-6db1-493e-b785-5d76aed4e2e9/kube-rbac-proxy-self/0.log" Apr 23 08:19:26.886701 
ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:26.886679 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-29k85_11528c38-6db1-493e-b785-5d76aed4e2e9/openshift-state-metrics/0.log" Apr 23 08:19:27.160869 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.160845 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d5bbd96d6-ql9h9_1f14cf83-8b6a-404f-bb4f-02f6a51d1596/telemeter-client/0.log" Apr 23 08:19:27.180692 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.180672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d5bbd96d6-ql9h9_1f14cf83-8b6a-404f-bb4f-02f6a51d1596/reload/0.log" Apr 23 08:19:27.203490 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.203475 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d5bbd96d6-ql9h9_1f14cf83-8b6a-404f-bb4f-02f6a51d1596/kube-rbac-proxy/0.log" Apr 23 08:19:27.240448 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.240426 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8499945ff4-znpmj_a7b0d765-1bbf-4aee-a093-b38a04ab37a7/thanos-query/0.log" Apr 23 08:19:27.271368 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.271345 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8499945ff4-znpmj_a7b0d765-1bbf-4aee-a093-b38a04ab37a7/kube-rbac-proxy-web/0.log" Apr 23 08:19:27.296530 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.296509 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8499945ff4-znpmj_a7b0d765-1bbf-4aee-a093-b38a04ab37a7/kube-rbac-proxy/0.log" Apr 23 08:19:27.319191 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.319168 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8499945ff4-znpmj_a7b0d765-1bbf-4aee-a093-b38a04ab37a7/prom-label-proxy/0.log" Apr 23 08:19:27.352000 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.351983 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8499945ff4-znpmj_a7b0d765-1bbf-4aee-a093-b38a04ab37a7/kube-rbac-proxy-rules/0.log" Apr 23 08:19:27.379090 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.379052 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8499945ff4-znpmj_a7b0d765-1bbf-4aee-a093-b38a04ab37a7/kube-rbac-proxy-metrics/0.log" Apr 23 08:19:27.534930 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:27.534901 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" path="/var/lib/kubelet/pods/0d72ac3d-027c-4ef4-afed-2c030a463968/volumes" Apr 23 08:19:29.084755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.084721 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c649c4cb-vbvbb_cb4e65b0-e6a4-460a-b7cd-342ac8714a2b/console/0.log" Apr 23 08:19:29.109986 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.109961 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-jx5bv_41b3d83e-0d9c-4724-b970-23cf0ef08d5a/download-server/0.log" Apr 23 08:19:29.445402 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445374 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r"] Apr 23 08:19:29.445712 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445699 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="copy" Apr 23 08:19:29.445755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445714 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="copy" Apr 23 08:19:29.445755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445731 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="gather" Apr 23 08:19:29.445755 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445736 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="gather" Apr 23 08:19:29.445848 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445793 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="gather" Apr 23 08:19:29.445848 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.445801 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d72ac3d-027c-4ef4-afed-2c030a463968" containerName="copy" Apr 23 08:19:29.450767 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.450748 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.453419 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.453398 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x67hq\"/\"default-dockercfg-kgww6\"" Apr 23 08:19:29.454374 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.454351 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x67hq\"/\"kube-root-ca.crt\"" Apr 23 08:19:29.454455 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.454358 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x67hq\"/\"openshift-service-ca.crt\"" Apr 23 08:19:29.457835 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.457815 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r"] Apr 23 08:19:29.512209 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.512185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-lib-modules\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.512341 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.512215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tgt\" (UniqueName: \"kubernetes.io/projected/37183cfe-5fa4-495c-8ac7-2321a4159f09-kube-api-access-67tgt\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.512341 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.512250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-sys\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.512341 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.512283 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-proc\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.512440 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.512358 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-podres\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613484 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-lib-modules\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613634 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tgt\" (UniqueName: \"kubernetes.io/projected/37183cfe-5fa4-495c-8ac7-2321a4159f09-kube-api-access-67tgt\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613634 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-sys\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613634 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-proc\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613634 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613599 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-podres\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613634 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-lib-modules\") pod \"perf-node-gather-daemonset-llk2r\" (UID: 
\"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-proc\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-sys\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.613851 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.613718 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37183cfe-5fa4-495c-8ac7-2321a4159f09-podres\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.621400 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.621371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tgt\" (UniqueName: \"kubernetes.io/projected/37183cfe-5fa4-495c-8ac7-2321a4159f09-kube-api-access-67tgt\") pod \"perf-node-gather-daemonset-llk2r\" (UID: \"37183cfe-5fa4-495c-8ac7-2321a4159f09\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.761829 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.761757 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:29.876820 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:29.876791 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r"] Apr 23 08:19:29.880488 ip-10-0-142-255 kubenswrapper[2579]: W0423 08:19:29.880454 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod37183cfe_5fa4_495c_8ac7_2321a4159f09.slice/crio-7bed51d2e27d582dd080354721e4f3b65796ebb9563d6674f5aa841895c30c0a WatchSource:0}: Error finding container 7bed51d2e27d582dd080354721e4f3b65796ebb9563d6674f5aa841895c30c0a: Status 404 returned error can't find the container with id 7bed51d2e27d582dd080354721e4f3b65796ebb9563d6674f5aa841895c30c0a Apr 23 08:19:30.145177 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.145146 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s2qrh_b1a03cf0-a55d-4e77-9ca0-a33f942b3b24/dns/0.log" Apr 23 08:19:30.164034 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.164011 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s2qrh_b1a03cf0-a55d-4e77-9ca0-a33f942b3b24/kube-rbac-proxy/0.log" Apr 23 08:19:30.205898 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.205873 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hnjm9_cf4bc19e-bb95-4e2e-9978-4a53f064696d/dns-node-resolver/0.log" Apr 23 08:19:30.665988 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.665952 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9b2bf_10f6f586-998e-4725-bb51-e801aba526fe/node-ca/0.log" Apr 23 08:19:30.844638 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.844606 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" event={"ID":"37183cfe-5fa4-495c-8ac7-2321a4159f09","Type":"ContainerStarted","Data":"3b9b7c83aa5ccceaec7af9dde545a66d4dde0ae0ad6426b262967a6e264b5e30"} Apr 23 08:19:30.844638 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.844641 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" event={"ID":"37183cfe-5fa4-495c-8ac7-2321a4159f09","Type":"ContainerStarted","Data":"7bed51d2e27d582dd080354721e4f3b65796ebb9563d6674f5aa841895c30c0a"} Apr 23 08:19:30.844853 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.844740 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:30.861097 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:30.861033 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" podStartSLOduration=1.861017913 podStartE2EDuration="1.861017913s" podCreationTimestamp="2026-04-23 08:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:19:30.859601493 +0000 UTC m=+425.986605321" watchObservedRunningTime="2026-04-23 08:19:30.861017913 +0000 UTC m=+425.988021757" Apr 23 08:19:31.322485 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:31.322455 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-64566ff68-kfxkd_8d015a9d-342c-4585-8306-d78b5774129d/router/0.log" Apr 23 08:19:31.634711 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:31.634667 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8969w_0721f5e7-8efa-45d5-a74c-776e204f81c6/serve-healthcheck-canary/0.log" Apr 23 08:19:31.995840 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:31.995759 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-n42s5_1a2fda67-d89f-4fa0-a61a-91dfc33b57fd/insights-operator/1.log" Apr 23 08:19:31.995973 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:31.995911 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-n42s5_1a2fda67-d89f-4fa0-a61a-91dfc33b57fd/insights-operator/0.log" Apr 23 08:19:32.015881 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:32.015855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5kw49_8800717e-dec3-48fd-8358-621330743c4c/kube-rbac-proxy/0.log" Apr 23 08:19:32.036568 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:32.036548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5kw49_8800717e-dec3-48fd-8358-621330743c4c/exporter/0.log" Apr 23 08:19:32.057303 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:32.057284 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5kw49_8800717e-dec3-48fd-8358-621330743c4c/extractor/0.log" Apr 23 08:19:35.896174 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:35.896135 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xxn8x_88352a02-c1be-4e1c-9b7a-9e7009f4911c/migrator/0.log" Apr 23 08:19:35.914601 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:35.914573 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xxn8x_88352a02-c1be-4e1c-9b7a-9e7009f4911c/graceful-termination/0.log" Apr 23 08:19:36.857556 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:36.857529 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-llk2r" Apr 23 08:19:37.238857 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.238834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/kube-multus-additional-cni-plugins/0.log" Apr 23 08:19:37.259535 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.259511 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/egress-router-binary-copy/0.log" Apr 23 08:19:37.281291 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.281267 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/cni-plugins/0.log" Apr 23 08:19:37.300019 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.299993 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/bond-cni-plugin/0.log" Apr 23 08:19:37.319618 ip-10-0-142-255 kubenswrapper[2579]: 
I0423 08:19:37.319597 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/routeoverride-cni/0.log" Apr 23 08:19:37.338167 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.338137 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/whereabouts-cni-bincopy/0.log" Apr 23 08:19:37.358358 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.358338 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h756m_325bc232-cf8c-46f7-a278-679124fa4e09/whereabouts-cni/0.log" Apr 23 08:19:37.387497 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.387474 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fj94f_015aecae-ba6f-4d84-946d-58733117d34f/kube-multus/0.log" Apr 23 08:19:37.457567 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.457547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d5mzc_41fa5042-9289-494c-9973-953c5146e01c/network-metrics-daemon/0.log" Apr 23 08:19:37.514835 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:37.514739 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d5mzc_41fa5042-9289-494c-9973-953c5146e01c/kube-rbac-proxy/0.log" Apr 23 08:19:38.991840 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:38.991811 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-controller/0.log" Apr 23 08:19:39.015409 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.015373 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/0.log" Apr 23 08:19:39.019175 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.019153 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovn-acl-logging/1.log" Apr 23 08:19:39.038933 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.038880 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/kube-rbac-proxy-node/0.log" Apr 23 08:19:39.059120 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.059100 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 08:19:39.075174 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.075154 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/northd/0.log" Apr 23 08:19:39.093850 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.093827 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/nbdb/0.log" Apr 23 08:19:39.118124 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.118086 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/sbdb/0.log" Apr 23 08:19:39.266726 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:39.266701 2579 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svpkl_ec9efb36-5e54-40e4-9ff4-f25ef8172507/ovnkube-controller/0.log" Apr 23 08:19:40.209001 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:40.208970 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-c4bd2_7aedc835-f4c8-4970-ba3b-30679f2aa6e9/network-check-target-container/0.log" Apr 23 08:19:41.060760 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:41.060705 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-ddlvz_a63e511b-0189-48fb-bcb8-1878f4bec538/iptables-alerter/0.log" Apr 23 08:19:41.669578 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:41.669553 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bbc8p_be039d04-f58e-488c-9676-348a83fcc83e/tuned/0.log" Apr 23 08:19:43.206449 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:43.206416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k7xpw_7318baba-3bf1-4121-9683-c65da37bc5ee/cluster-samples-operator/0.log" Apr 23 08:19:43.223450 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:43.223425 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k7xpw_7318baba-3bf1-4121-9683-c65da37bc5ee/cluster-samples-operator-watch/0.log" Apr 23 08:19:44.817815 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:44.817733 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zltrc_3f1e47b2-e320-437a-b287-7a3cb3b8613f/csi-driver/0.log" Apr 23 08:19:44.838647 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:44.838596 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zltrc_3f1e47b2-e320-437a-b287-7a3cb3b8613f/csi-node-driver-registrar/0.log" Apr 23 08:19:44.856945 ip-10-0-142-255 kubenswrapper[2579]: I0423 08:19:44.856921 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zltrc_3f1e47b2-e320-437a-b287-7a3cb3b8613f/csi-liveness-probe/0.log"