Apr 24 22:27:06.146269 ip-10-0-133-73 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 22:27:06.146281 ip-10-0-133-73 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 22:27:06.146288 ip-10-0-133-73 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 22:27:06.146530 ip-10-0-133-73 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 22:27:16.253715 ip-10-0-133-73 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 22:27:16.253736 ip-10-0-133-73 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 19b2b54d67a449078b1856e292462e1f --
Apr 24 22:29:24.356102 ip-10-0-133-73 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:24.865267 ip-10-0-133-73 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:24.865267 ip-10-0-133-73 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:24.865267 ip-10-0-133-73 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:24.865267 ip-10-0-133-73 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:24.865267 ip-10-0-133-73 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:24.867234 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.867145 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:24.871002 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.870987 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:24.871002 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871003 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871006 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871009 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871012 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871015 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871018 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871021 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871024 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871029 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871033 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871036 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871039 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871042 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871045 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871048 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871055 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871058 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871061 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871064 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871067 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:24.871062 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871070 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871073 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871077 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871081 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871085 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871088 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871090 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871093 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871095 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871098 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871100 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871104 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871107 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871109 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871112 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871115 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871117 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871120 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871123 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:24.871527 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871125 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871128 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871130 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871133 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871135 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871138 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871140 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871143 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871146 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871148 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871151 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871153 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871156 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871158 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871161 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871166 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871169 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871171 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871174 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871177 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:24.872008 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871179 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871182 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871185 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871187 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871190 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871193 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871195 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871197 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871200 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871203 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871206 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871208 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871212 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871215 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871217 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871220 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871223 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871226 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871228 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871231 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:24.872504 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871233 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871236 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871238 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871241 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871243 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871246 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871624 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871630 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871633 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871637 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871641 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871645 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871647 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871650 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871652 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871655 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871658 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871660 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871663 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:24.873021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871666 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871668 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871671 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871673 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871676 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871679 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871682 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871684 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871687 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871689 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871692 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871694 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871697 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871699 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871702 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871704 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871707 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871709 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871711 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871714 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:24.873482 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871718 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871721 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871723 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871726 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871728 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871731 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871733 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871736 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871738 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871741 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871743 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871747 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871749 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871752 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871754 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871757 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871759 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871762 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871765 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871767 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:24.874021 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871770 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871773 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871775 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871778 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871781 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871784 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871786 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871788 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871791 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871793 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871796 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871799 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871817 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871820 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871824 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871827 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871830 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871832 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871835 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871838 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:24.874509 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871840 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871843 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871845 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871848 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871850 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871853 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871857 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871861 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871864 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871866 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871869 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871875 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.871878 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873789 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873799 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873817 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873822 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873827 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873830 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873835 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873840 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:24.875037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873843 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873847 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873850 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873853 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873856 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873860 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873862 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873866 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873869 2582 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873872 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873875 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873880 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873882 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873886 2582 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873889 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873892 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873896 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873899 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873903 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873906 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873909 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873912 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873915 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873919 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873922 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:29:24.875536 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873927 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873931 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873934 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873937 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873940 2582 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873943 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873947 2582 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873950 2582 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873953 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873956 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873959 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873963 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873966 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873969 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873973 2582 flags.go:64] FLAG: --eviction-soft="" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873975 2582 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873978 2582 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873982 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873985 2582 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873988 2582 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873991 2582 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873994 2582 flags.go:64] FLAG: --feature-gates="" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.873998 2582 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874001 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874004 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:29:24.876154 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874007 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874010 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874013 2582 
flags.go:64] FLAG: --help="false" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874016 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-133-73.ec2.internal" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874019 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874022 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874026 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874030 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874034 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874037 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874040 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874043 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874046 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874049 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874052 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874055 2582 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:24.876734 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:24.874058 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874061 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874064 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874067 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874070 2582 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874073 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874076 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874079 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:24.876734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874085 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874088 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874091 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874094 2582 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874097 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874100 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874103 2582 flags.go:64] FLAG: 
--manifest-url="" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874108 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874112 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874115 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874119 2582 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874122 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874125 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874128 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874131 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874135 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874138 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874141 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874149 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874152 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874155 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:24.877334 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:24.874158 2582 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874161 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:24.877334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874167 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874170 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874173 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874176 2582 flags.go:64] FLAG: --port="10250" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874179 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874182 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a4697974bde9e6a2" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874185 2582 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874188 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874191 2582 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874194 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874197 2582 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874200 2582 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874203 2582 
flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874206 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874209 2582 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874213 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874220 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874223 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874226 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874229 2582 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874232 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874235 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874238 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874241 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874244 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874250 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:24.877886 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874254 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:24.878506 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:24.874257 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874261 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874264 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874266 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874270 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874273 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874276 2582 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874278 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874284 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874287 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874289 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874294 2582 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874297 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874299 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874302 2582 flags.go:64] FLAG: 
--topology-manager-policy-options="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874305 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874308 2582 flags.go:64] FLAG: --v="2" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874313 2582 flags.go:64] FLAG: --version="false" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874317 2582 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874322 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.874325 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875343 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875349 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:24.878506 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875353 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875356 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875359 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875362 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875365 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 
22:29:24.875368 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875371 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875376 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875380 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875383 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875386 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875389 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875392 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875395 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875397 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875401 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875403 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875406 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:24.879073 
ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875409 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875412 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:24.879073 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875414 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875416 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875419 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875421 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875425 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875430 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875432 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875435 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875437 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875440 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875443 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875446 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875448 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875451 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875453 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875456 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875460 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875464 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:24.879615 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875466 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875471 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875474 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875478 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875481 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875484 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875487 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875490 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875492 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875495 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875497 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875500 2582 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875503 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875505 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875508 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875510 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875513 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875516 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875518 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875520 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:24.880093 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875523 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875526 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875528 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875531 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:24.880572 ip-10-0-133-73 
kubenswrapper[2582]: W0424 22:29:24.875534 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875536 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875539 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875542 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875545 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875548 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875551 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875553 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875556 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875560 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875562 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875565 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875568 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:24.880572 ip-10-0-133-73 
kubenswrapper[2582]: W0424 22:29:24.875570 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875573 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875575 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:24.880572 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875578 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:24.881113 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875580 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:24.881113 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875583 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:24.881113 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875585 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:24.881113 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875588 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:24.881113 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.875591 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:24.881113 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.876387 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 24 22:29:24.883167 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.883148 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 22:29:24.883167 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.883165 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883220 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883225 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883228 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883232 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883235 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883238 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883241 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883244 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883247 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883250 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883252 2582 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883255 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883258 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883260 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883263 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883266 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883268 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883273 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:24.883280 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883277 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883281 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883284 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883287 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883290 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883293 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883296 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883299 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883301 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883304 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883306 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883309 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883311 2582 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883314 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883317 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883319 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883322 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883325 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883327 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883330 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:24.883740 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883332 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883335 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883337 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883340 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883342 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 
22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883345 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883347 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883350 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883352 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883355 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883357 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883360 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883362 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883365 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883369 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883371 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883374 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883377 2582 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883381 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:24.884309 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883384 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883387 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883389 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883392 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883394 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883397 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883399 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883402 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883404 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883407 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883409 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 
22:29:24.883412 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883414 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883417 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883420 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883422 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883425 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883427 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883430 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883433 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:24.884870 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883435 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883438 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883440 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883443 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider 
Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883445 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883448 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883450 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883453 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883456 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.883461 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883558 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883563 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883568 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883572 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883575 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:24.885363 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883578 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883581 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883583 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883586 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883588 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883591 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883593 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883596 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883598 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883600 2582 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883603 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883606 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883608 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883611 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883613 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883616 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883618 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883621 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883623 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883626 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:24.885730 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883628 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883631 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: 
W0424 22:29:24.883633 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883636 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883638 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883641 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883644 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883647 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883649 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883652 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883654 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883657 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883659 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883661 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883664 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: 
W0424 22:29:24.883666 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883669 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883671 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883674 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883676 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:24.886273 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883679 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883681 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883684 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883686 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883689 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883692 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883694 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883697 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:24.886753 
ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883700 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883702 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883705 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883707 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883710 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883712 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883714 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883717 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883719 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883722 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883725 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883728 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:24.886753 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883730 2582 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883733 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883735 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883738 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883740 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883742 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883746 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883750 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883753 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883756 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883758 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883761 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883764 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883766 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883769 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883771 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883774 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883776 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:24.887256 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883779 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:24.887256 
ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883782 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:24.887699 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:24.883784 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:24.887699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.883789 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:24.887699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.884694 2582 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 22:29:24.888002 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.887988 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 22:29:24.889099 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.889087 2582 server.go:1019] "Starting client certificate rotation" Apr 24 22:29:24.889210 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.889192 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:29:24.889247 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.889231 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:29:24.918410 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.918388 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:29:24.922081 
ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.922061 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:29:24.937676 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.937652 2582 log.go:25] "Validated CRI v1 runtime API" Apr 24 22:29:24.944742 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.944723 2582 log.go:25] "Validated CRI v1 image API" Apr 24 22:29:24.947170 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.947149 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 22:29:24.949822 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.949777 2582 fs.go:135] Filesystem UUIDs: map[5930956e-e514-4f38-9d48-1f4ad4ebe720:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 afa511f6-54ea-4a30-86c7-41d2ebb0a06e:/dev/nvme0n1p3] Apr 24 22:29:24.949897 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.949799 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 22:29:24.953163 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.953138 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:29:24.956848 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.956705 2582 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:24.953690671 +0000 UTC m=+0.464217867 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099410 MemoryCapacity:32812167168 SwapCapacity:0 
MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a821b261adc764d7394224ded90a7 SystemUUID:ec2a821b-261a-dc76-4d73-94224ded90a7 BootID:19b2b54d-67a4-4907-8b18-56e292462e1f Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4e:0f:9e:63:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4e:0f:9e:63:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:96:de:22:8b:76 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:29:24.956848 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.956842 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:29:24.957006 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.956964 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:29:24.959775 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.959749 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:29:24.959968 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.959778 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-73.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 22:29:24.960051 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.959983 2582 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 22:29:24.960051 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.959997 2582 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 22:29:24.960051 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.960015 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:24.962586 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.962573 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:24.964543 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.964531 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:24.964681 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.964670 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 22:29:24.967683 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.967671 2582 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 22:29:24.967748 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.967696 2582 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 22:29:24.967748 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.967717 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 22:29:24.967748 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.967736 2582 kubelet.go:397] "Adding apiserver pod source"
Apr 24 22:29:24.967748 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.967747 2582 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 22:29:24.969067 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.969054 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:24.969140 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.969077 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:24.972601 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.972587 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 22:29:24.974312 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.974298 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 22:29:24.976485 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976473 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 22:29:24.976538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976490 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 22:29:24.976538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976498 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 22:29:24.976538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976503 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 22:29:24.976538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976510 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 22:29:24.976538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976517 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 22:29:24.976538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976537 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 22:29:24.976700 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976544 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 22:29:24.976700 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976550 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 22:29:24.976700 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976556 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 22:29:24.976700 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976569 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 22:29:24.976700 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.976578 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 22:29:24.977542 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.977529 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 22:29:24.977542 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.977543 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 22:29:24.980430 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:24.980405 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-73.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 22:29:24.980498 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:24.980429 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 22:29:24.980549 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.980540 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-73.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 22:29:24.981626 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.981614 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 22:29:24.981678 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.981652 2582 server.go:1295] "Started kubelet"
Apr 24 22:29:24.981770 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.981748 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 22:29:24.981770 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.981733 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 22:29:24.981858 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.981799 2582 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 22:29:24.982415 ip-10-0-133-73 systemd[1]: Started Kubernetes Kubelet.
Apr 24 22:29:24.986687 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.986665 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 22:29:24.987334 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.987316 2582 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 22:29:24.990781 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.990761 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8gxbz"
Apr 24 22:29:24.993233 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.993209 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:24.993874 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.993849 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 22:29:24.996266 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996247 2582 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 22:29:24.996375 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996363 2582 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 22:29:24.996530 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996511 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 22:29:24.996651 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996631 2582 factory.go:55] Registering systemd factory
Apr 24 22:29:24.996651 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996247 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 22:29:24.996651 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996590 2582 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 22:29:24.996651 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:24.996604 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:24.996830 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996657 2582 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 22:29:24.996830 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.996646 2582 factory.go:223] Registration of the systemd container factory successfully
Apr 24 22:29:24.997380 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.997353 2582 factory.go:153] Registering CRI-O factory
Apr 24 22:29:24.997380 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.997372 2582 factory.go:223] Registration of the crio container factory successfully
Apr 24 22:29:24.997509 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.997389 2582 factory.go:103] Registering Raw factory
Apr 24 22:29:24.997509 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.997401 2582 manager.go:1196] Started watching for new ooms in manager
Apr 24 22:29:24.997985 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:24.997766 2582 manager.go:319] Starting recovery of all containers
Apr 24 22:29:24.998362 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:24.998331 2582 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 22:29:24.999986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:24.999959 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 22:29:25.000409 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.000390 2582 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-73.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 22:29:25.000493 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.000474 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8gxbz"
Apr 24 22:29:25.001312 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:24.999998 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-73.ec2.internal.18a96b8932d11888 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-73.ec2.internal,UID:ip-10-0-133-73.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-73.ec2.internal,},FirstTimestamp:2026-04-24 22:29:24.981627016 +0000 UTC m=+0.492154206,LastTimestamp:2026-04-24 22:29:24.981627016 +0000 UTC m=+0.492154206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-73.ec2.internal,}"
Apr 24 22:29:25.006447 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.006422 2582 manager.go:324] Recovery completed
Apr 24 22:29:25.011368 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.011353 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:25.015028 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.015014 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:25.015087 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.015043 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:25.015087 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.015056 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:25.015643 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.015621 2582 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 22:29:25.015643 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.015641 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 22:29:25.015730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.015673 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:25.019343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.019330 2582 policy_none.go:49] "None policy: Start"
Apr 24 22:29:25.019343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.019346 2582 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 22:29:25.019449 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.019358 2582 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058348 2582 manager.go:341] "Starting Device Plugin manager"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.058378 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058389 2582 server.go:85] "Starting device plugin registration server"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058682 2582 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058696 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058778 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058879 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.058889 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.059473 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 22:29:25.077773 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.059505 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.097951 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.097919 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 22:29:25.099213 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.099191 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 22:29:25.099305 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.099219 2582 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 22:29:25.099305 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.099237 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 22:29:25.099305 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.099243 2582 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 22:29:25.099305 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.099274 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 22:29:25.102158 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.102134 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:25.159698 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.159616 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:25.160766 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.160738 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:25.160898 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.160779 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:25.160898 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.160794 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:25.160898 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.160839 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.177644 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.177627 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.177689 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.177649 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-73.ec2.internal\": node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.199413 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.199392 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"]
Apr 24 22:29:25.199491 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.199449 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:25.200199 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.200179 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:25.200288 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.200208 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:25.200288 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.200219 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:25.202464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.202452 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:25.202615 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.202600 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.202651 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.202639 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:25.203041 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.203023 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.203151 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.203126 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:25.203151 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.203149 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:25.203230 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.203165 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:25.203230 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.203128 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:25.203230 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.203193 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:25.203230 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.203205 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:25.205395 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.205379 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.205483 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.205410 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:25.206108 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.206094 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:25.206171 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.206115 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:25.206171 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.206124 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:25.232402 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.232377 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-73.ec2.internal\" not found" node="ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.236665 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.236648 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-73.ec2.internal\" not found" node="ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.298855 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.298830 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d74e5b7f3ca859862ed6413284694748-config\") pod \"kube-apiserver-proxy-ip-10-0-133-73.ec2.internal\" (UID: \"d74e5b7f3ca859862ed6413284694748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.298855 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.298857 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.299007 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.298876 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.303623 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.303605 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.399221 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.399195 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.399317 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.399229 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.399317 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.399254 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d74e5b7f3ca859862ed6413284694748-config\") pod \"kube-apiserver-proxy-ip-10-0-133-73.ec2.internal\" (UID: \"d74e5b7f3ca859862ed6413284694748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.399317 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.399290 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.399317 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.399298 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.399317 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.399292 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d74e5b7f3ca859862ed6413284694748-config\") pod \"kube-apiserver-proxy-ip-10-0-133-73.ec2.internal\" (UID: \"d74e5b7f3ca859862ed6413284694748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.404306 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.404287 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.505242 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.505171 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.534371 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.534346 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.538844 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.538828 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:25.605825 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.605763 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.706411 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.706391 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.807043 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.806978 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.889244 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.889204 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:25.889921 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.889354 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:25.907778 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:25.907743 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:25.993352 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:25.993322 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:26.003781 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.003737 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:24 +0000 UTC" deadline="2027-12-14 17:24:38.194724819 +0000 UTC"
Apr 24 22:29:26.003781 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.003770 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14370h55m12.190957919s"
Apr 24 22:29:26.003944 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.003847 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:26.007929 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:26.007903 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:26.025820 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.025781 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v582w"
Apr 24 22:29:26.027242 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.027228 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:26.031563 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.031544 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v582w"
Apr 24 22:29:26.036757 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.036741 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:26.096122 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:26.096089 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fe24b0efea27e3f5eca2d71913b3e7.slice/crio-55e82040a9e7401833799cd6d317b1a51174acae1fd11288673d8471aade130c WatchSource:0}: Error finding container 55e82040a9e7401833799cd6d317b1a51174acae1fd11288673d8471aade130c: Status 404 returned error can't find the container with id 55e82040a9e7401833799cd6d317b1a51174acae1fd11288673d8471aade130c
Apr 24 22:29:26.096427 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:26.096404 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74e5b7f3ca859862ed6413284694748.slice/crio-73c7c327741dcff5632cecde4fce4383afc5ecd2c4c6242c6654176d78c70582 WatchSource:0}: Error finding container 73c7c327741dcff5632cecde4fce4383afc5ecd2c4c6242c6654176d78c70582: Status 404 returned error can't find the container with id 73c7c327741dcff5632cecde4fce4383afc5ecd2c4c6242c6654176d78c70582
Apr 24 22:29:26.101942 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.101866 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:29:26.103502 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.103462 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" event={"ID":"48fe24b0efea27e3f5eca2d71913b3e7","Type":"ContainerStarted","Data":"55e82040a9e7401833799cd6d317b1a51174acae1fd11288673d8471aade130c"}
Apr 24 22:29:26.105084 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.105061 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" event={"ID":"d74e5b7f3ca859862ed6413284694748","Type":"ContainerStarted","Data":"73c7c327741dcff5632cecde4fce4383afc5ecd2c4c6242c6654176d78c70582"}
Apr 24 22:29:26.108666 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:26.108649 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:26.208906 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:26.208852 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found"
Apr 24 22:29:26.286545 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.286515 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:26.294607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.294587 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:26.305357 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.305338 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:26.306605 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.306593 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal"
Apr 24 22:29:26.319327 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.319308 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:26.733612 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.733521 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:26.968280 ip-10-0-133-73 kubenswrapper[2582]: I0424
22:29:26.968244 2582 apiserver.go:52] "Watching apiserver" Apr 24 22:29:26.976723 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.976697 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 22:29:26.978030 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.978002 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fq7vn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal","openshift-multus/network-metrics-daemon-44r7l","openshift-network-diagnostics/network-check-target-k24sc","openshift-network-operator/iptables-alerter-b6zc7","kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal","openshift-image-registry/node-ca-sw5hp","openshift-multus/multus-additional-cni-plugins-n5rkg","openshift-multus/multus-d8kzk","openshift-ovn-kubernetes/ovnkube-node-4ntzt","kube-system/konnectivity-agent-j5w9d","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc","openshift-cluster-node-tuning-operator/tuned-vcbqm"] Apr 24 22:29:26.982347 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.982325 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:26.985403 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.985349 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:29:26.985538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.985513 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:29:26.985671 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.985652 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qlvgm\"" Apr 24 22:29:26.985730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.985708 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:29:26.986592 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.986570 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:26.986689 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:26.986665 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb" Apr 24 22:29:26.988888 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.988866 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:26.991649 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.991158 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:26.991649 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:26.991221 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0" Apr 24 22:29:26.991649 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.991263 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:26.991649 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.991457 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:26.991649 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.991507 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tlrrk\"" Apr 24 22:29:26.991975 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.991660 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:26.991975 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.991763 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:26.994157 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.993978 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bcxh4\"" Apr 24 22:29:26.994608 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.994443 2582 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:26.994702 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.994687 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:26.998948 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.998225 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d8kzk" Apr 24 22:29:26.999584 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:26.999092 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.001898 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.000851 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wdlhb\"" Apr 24 22:29:27.001898 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.000939 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:27.003092 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.003068 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:29:27.003397 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.003363 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.005849 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.006095 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.006355 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.006568 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-29drs\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.006821 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.007056 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.007067 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.007447 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008287 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-conf-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008324 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-system-cni-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008350 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-cni-bin\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008408 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-tmp-dir\") pod 
\"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008442 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-multus-certs\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008474 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-systemd\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008503 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-log-socket\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008531 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008666 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-socket-dir-parent\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008716 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-run-netns\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.008906 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008846 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-env-overrides\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008934 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-ovnkube-script-lib\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.008979 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-cnibin\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.009828 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:27.009017 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-cni-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009165 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7494b64-7d53-401b-8d5f-fbe58b9bf342-cni-binary-copy\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009231 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-node-log\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009273 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009323 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndw5l\" (UniqueName: \"kubernetes.io/projected/3d1e9be3-c77c-4335-aa84-6e1675c140a1-kube-api-access-ndw5l\") pod \"iptables-alerter-b6zc7\" (UID: 
\"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009342 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-system-cni-dir\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009365 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-hosts-file\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009405 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-netns\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009420 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59ba8b67-3c2d-436f-beec-62d19349a64d-ovn-node-metrics-cert\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009438 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fb372715-ce4d-476e-881a-eedf339ac388-host\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009488 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009509 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d1e9be3-c77c-4335-aa84-6e1675c140a1-iptables-alerter-script\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009619 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-os-release\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.009828 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009661 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-daemon-config\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009696 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-etc-kubernetes\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009739 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009788 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009923 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.009954 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-cni-bin\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010036 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzdc\" 
(UniqueName: \"kubernetes.io/projected/fb372715-ce4d-476e-881a-eedf339ac388-kube-api-access-tgzdc\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010097 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010107 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010166 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg782\" (UniqueName: \"kubernetes.io/projected/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-kube-api-access-fg782\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010237 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010331 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-k8s-cni-cncf-io\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.010567 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:27.010312 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8lpxp\"" Apr 24 22:29:27.010567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010413 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:29:27.011180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010750 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-kubelet\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.011180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010844 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-ovnkube-config\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.011180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.010884 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jffsr\" (UniqueName: \"kubernetes.io/projected/acca45df-62e2-4002-8d37-055685b49029-kube-api-access-jffsr\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.011180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.011016 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.011180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.011045 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.011180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.011086 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-582p9\"" Apr 24 22:29:27.011460 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.011435 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:29:27.011568 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.011544 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-os-release\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.011756 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.011733 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:29:27.012370 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.012346 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-cni-multus\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.013654 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.013634 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-cni-netd\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.013793 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.013776 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.013982 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.013960 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hbw\" (UniqueName: \"kubernetes.io/projected/c19cc309-d892-45ed-a3cd-43a98273bafb-kube-api-access-67hbw\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:27.014143 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014130 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:29:27.014218 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014189 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:29:27.014329 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014309 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") 
" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:27.014439 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014426 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:29:27.014493 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014459 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:27.014544 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014427 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-systemd-units\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.014544 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014533 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d1e9be3-c77c-4335-aa84-6e1675c140a1-host-slash\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014549 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-cnibin\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014564 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-kubelet\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014582 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lws\" (UniqueName: \"kubernetes.io/projected/f7494b64-7d53-401b-8d5f-fbe58b9bf342-kube-api-access-v7lws\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014592 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014597 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-ovn\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014613 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-slash\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014628 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-var-lib-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: 
\"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.014645 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014642 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-etc-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.015047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014656 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxv5h\" (UniqueName: \"kubernetes.io/projected/59ba8b67-3c2d-436f-beec-62d19349a64d-kube-api-access-rxv5h\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.015047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014433 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pqpxh\"" Apr 24 22:29:27.015047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014669 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb372715-ce4d-476e-881a-eedf339ac388-serviceca\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.015047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014683 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-hostroot\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.015047 
ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014697 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.015047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.014378 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j87cq\"" Apr 24 22:29:27.033836 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.032662 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:26 +0000 UTC" deadline="2028-01-26 10:45:02.763231004 +0000 UTC" Apr 24 22:29:27.033836 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.032692 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15396h15m35.730543013s" Apr 24 22:29:27.097864 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.097839 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:29:27.115306 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115273 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-hosts-file\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115316 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-netns\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115341 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59ba8b67-3c2d-436f-beec-62d19349a64d-ovn-node-metrics-cert\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115388 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb372715-ce4d-476e-881a-eedf339ac388-host\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115391 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-netns\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115412 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115398 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-hosts-file\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115437 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb372715-ce4d-476e-881a-eedf339ac388-host\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.115444 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115443 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t68xt\" (UniqueName: \"kubernetes.io/projected/527aa17b-dc79-48d9-ab47-acf333ccde3f-kube-api-access-t68xt\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.115720 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115483 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d1e9be3-c77c-4335-aa84-6e1675c140a1-iptables-alerter-script\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.115720 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.115537 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:27.115720 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.115613 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:27.61557703 +0000 UTC m=+3.126104208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:27.115836 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115786 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:29:27.115877 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115862 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-os-release\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.115991 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.115971 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-os-release\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.116037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116022 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc95n\" (UniqueName: \"kubernetes.io/projected/0e3a9e21-81c9-47c3-8145-7d69cfec4599-kube-api-access-pc95n\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.116085 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116055 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-daemon-config\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.116119 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116080 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-etc-kubernetes\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116117 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116155 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116156 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-etc-kubernetes\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116176 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-cni-bin\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116195 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzdc\" (UniqueName: \"kubernetes.io/projected/fb372715-ce4d-476e-881a-eedf339ac388-kube-api-access-tgzdc\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116207 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116246 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116232 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116256 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg782\" (UniqueName: \"kubernetes.io/projected/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-kube-api-access-fg782\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 
22:29:27.116277 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d1e9be3-c77c-4335-aa84-6e1675c140a1-iptables-alerter-script\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116283 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-k8s-cni-cncf-io\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116292 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116305 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-kubelet\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116256 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-cni-bin\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116527 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:27.116329 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-ovnkube-config\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116357 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jffsr\" (UniqueName: \"kubernetes.io/projected/acca45df-62e2-4002-8d37-055685b49029-kube-api-access-jffsr\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116362 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-kubelet\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116390 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-etc-selinux\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116417 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-sys-fs\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: 
\"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116441 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-var-lib-kubelet\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-os-release\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.116527 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116496 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116548 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-k8s-cni-cncf-io\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116611 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-daemon-config\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116641 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-os-release\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116673 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-socket-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-cni-multus\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116743 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-cni-netd\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116768 2582 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116795 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67hbw\" (UniqueName: \"kubernetes.io/projected/c19cc309-d892-45ed-a3cd-43a98273bafb-kube-api-access-67hbw\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116837 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116867 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7502ad9c-0942-4f13-92a5-5c98853da696-konnectivity-ca\") pod \"konnectivity-agent-j5w9d\" (UID: \"7502ad9c-0942-4f13-92a5-5c98853da696\") " pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116894 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-host\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 
22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116920 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-systemd-units\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116946 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d1e9be3-c77c-4335-aa84-6e1675c140a1-host-slash\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.116970 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-cnibin\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117008 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-kubelet\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.117026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117026 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lws\" (UniqueName: \"kubernetes.io/projected/f7494b64-7d53-401b-8d5f-fbe58b9bf342-kube-api-access-v7lws\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.117515 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:27.117043 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-ovn\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117058 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/527aa17b-dc79-48d9-ab47-acf333ccde3f-tmp\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117090 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-slash\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117105 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-var-lib-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117103 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117121 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysctl-d\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117137 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-etc-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117150 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-systemd-units\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117151 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxv5h\" (UniqueName: \"kubernetes.io/projected/59ba8b67-3c2d-436f-beec-62d19349a64d-kube-api-access-rxv5h\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117179 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb372715-ce4d-476e-881a-eedf339ac388-serviceca\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117196 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-sys\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117211 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-lib-modules\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117254 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-hostroot\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117288 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7502ad9c-0942-4f13-92a5-5c98853da696-agent-certs\") pod \"konnectivity-agent-j5w9d\" (UID: \"7502ad9c-0942-4f13-92a5-5c98853da696\") " pod="kube-system/konnectivity-agent-j5w9d"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117305 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-conf-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.117515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117323 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-system-cni-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117338 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d1e9be3-c77c-4335-aa84-6e1675c140a1-host-slash\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117336 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-ovnkube-config\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117360 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-cni-bin\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117340 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-cni-bin\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117374 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-cnibin\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117397 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117401 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-tmp-dir\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117402 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-cni-multus\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117434 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-multus-certs\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117445 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-var-lib-kubelet\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-systemd\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117476 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-cni-netd\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117496 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-log-socket\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117526 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-device-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117553 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-ovn\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117553 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-kubernetes\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117591 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-run\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.118096 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117641 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-host-run-multus-certs\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117653 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-slash\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117687 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-var-lib-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117720 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-etc-openvswitch\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117716 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117740 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117752 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-socket-dir-parent\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117754 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-hostroot\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117769 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-run-netns\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117788 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-env-overrides\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117821 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-ovnkube-script-lib\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117840 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-cnibin\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117860 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-cni-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117917 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7494b64-7d53-401b-8d5f-fbe58b9bf342-cni-binary-copy\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117934 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-node-log\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117946 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-tmp-dir\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117950 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.118881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117971 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-modprobe-d\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.117994 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-log-socket\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118023 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-cni-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118023 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysctl-conf\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118051 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-systemd\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118058 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-host-run-netns\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118044 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-run-systemd\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118083 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-conf-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118172 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-cnibin\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118173 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-system-cni-dir\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118256 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7494b64-7d53-401b-8d5f-fbe58b9bf342-multus-socket-dir-parent\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118304 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-tuned\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118370 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndw5l\" (UniqueName: \"kubernetes.io/projected/3d1e9be3-c77c-4335-aa84-6e1675c140a1-kube-api-access-ndw5l\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118398 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-system-cni-dir\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118439 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-registration-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118506 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysconfig\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118586 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acca45df-62e2-4002-8d37-055685b49029-system-cni-dir\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.119464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118660 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59ba8b67-3c2d-436f-beec-62d19349a64d-node-log\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118689 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-ovnkube-script-lib\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118705 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7494b64-7d53-401b-8d5f-fbe58b9bf342-cni-binary-copy\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118724 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118742 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/acca45df-62e2-4002-8d37-055685b49029-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.118966 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb372715-ce4d-476e-881a-eedf339ac388-serviceca\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.119022 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59ba8b67-3c2d-436f-beec-62d19349a64d-env-overrides\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.120060 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.119645 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59ba8b67-3c2d-436f-beec-62d19349a64d-ovn-node-metrics-cert\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.132067 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.132042 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:27.132182 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.132091 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:27.132182 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.132107 2582 projected.go:194] Error preparing data for projected volume kube-api-access-fd2h4 for pod openshift-network-diagnostics/network-check-target-k24sc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:27.132182 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.132173 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4 podName:df25b403-ced4-4c31-9691-1da44a52f2a0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:27.632155006 +0000 UTC m=+3.142682204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2h4" (UniqueName: "kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4") pod "network-check-target-k24sc" (UID: "df25b403-ced4-4c31-9691-1da44a52f2a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:27.134277 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.134249 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jffsr\" (UniqueName: \"kubernetes.io/projected/acca45df-62e2-4002-8d37-055685b49029-kube-api-access-jffsr\") pod \"multus-additional-cni-plugins-n5rkg\" (UID: \"acca45df-62e2-4002-8d37-055685b49029\") " pod="openshift-multus/multus-additional-cni-plugins-n5rkg"
Apr 24 22:29:27.134420 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.134397 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzdc\" (UniqueName: \"kubernetes.io/projected/fb372715-ce4d-476e-881a-eedf339ac388-kube-api-access-tgzdc\") pod \"node-ca-sw5hp\" (UID: \"fb372715-ce4d-476e-881a-eedf339ac388\") " pod="openshift-image-registry/node-ca-sw5hp"
Apr 24 22:29:27.137003 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.136978 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hbw\" (UniqueName: \"kubernetes.io/projected/c19cc309-d892-45ed-a3cd-43a98273bafb-kube-api-access-67hbw\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:27.137225 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.137201 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lws\" (UniqueName: \"kubernetes.io/projected/f7494b64-7d53-401b-8d5f-fbe58b9bf342-kube-api-access-v7lws\") pod \"multus-d8kzk\" (UID: \"f7494b64-7d53-401b-8d5f-fbe58b9bf342\") " pod="openshift-multus/multus-d8kzk"
Apr 24 22:29:27.137357 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.137335 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndw5l\" (UniqueName: \"kubernetes.io/projected/3d1e9be3-c77c-4335-aa84-6e1675c140a1-kube-api-access-ndw5l\") pod \"iptables-alerter-b6zc7\" (UID: \"3d1e9be3-c77c-4335-aa84-6e1675c140a1\") " pod="openshift-network-operator/iptables-alerter-b6zc7"
Apr 24 22:29:27.137426 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.137359 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg782\" (UniqueName: \"kubernetes.io/projected/e6d48c3e-9be8-4750-ab9c-18ee060b61dd-kube-api-access-fg782\") pod \"node-resolver-fq7vn\" (UID: \"e6d48c3e-9be8-4750-ab9c-18ee060b61dd\") " pod="openshift-dns/node-resolver-fq7vn"
Apr 24 22:29:27.138474 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.138452 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxv5h\" (UniqueName: \"kubernetes.io/projected/59ba8b67-3c2d-436f-beec-62d19349a64d-kube-api-access-rxv5h\") pod \"ovnkube-node-4ntzt\" (UID: \"59ba8b67-3c2d-436f-beec-62d19349a64d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt"
Apr 24 22:29:27.219247 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219216 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.219247 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219250 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-socket-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.219465 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219338 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.219465 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219366 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-socket-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.219465 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7502ad9c-0942-4f13-92a5-5c98853da696-konnectivity-ca\") pod \"konnectivity-agent-j5w9d\" (UID: \"7502ad9c-0942-4f13-92a5-5c98853da696\") " pod="kube-system/konnectivity-agent-j5w9d"
Apr 24 22:29:27.219465 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219419 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-host\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219470 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-host\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219491 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/527aa17b-dc79-48d9-ab47-acf333ccde3f-tmp\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219508 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysctl-d\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219525 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-sys\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219539 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-lib-modules\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219555 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7502ad9c-0942-4f13-92a5-5c98853da696-agent-certs\") pod \"konnectivity-agent-j5w9d\" (UID: \"7502ad9c-0942-4f13-92a5-5c98853da696\") " pod="kube-system/konnectivity-agent-j5w9d"
Apr 24 22:29:27.219585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219576 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-device-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc"
Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-kubernetes\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219617 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-run\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219650 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-modprobe-d\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm"
Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219660 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName:
\"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-sys\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219672 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysctl-conf\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219688 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysctl-d\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219695 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-systemd\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219733 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-tuned\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219743 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-kubernetes\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219758 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-modprobe-d\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219766 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-registration-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219793 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysconfig\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219864 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-registration-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219823 2582 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-systemd\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219689 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-lib-modules\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219827 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-device-dir\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.219959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219822 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysctl-conf\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219907 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t68xt\" (UniqueName: \"kubernetes.io/projected/527aa17b-dc79-48d9-ab47-acf333ccde3f-kube-api-access-t68xt\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219942 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pc95n\" (UniqueName: \"kubernetes.io/projected/0e3a9e21-81c9-47c3-8145-7d69cfec4599-kube-api-access-pc95n\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219963 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7502ad9c-0942-4f13-92a5-5c98853da696-konnectivity-ca\") pod \"konnectivity-agent-j5w9d\" (UID: \"7502ad9c-0942-4f13-92a5-5c98853da696\") " pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219982 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-etc-selinux\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.220006 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-sys-fs\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.220030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-var-lib-kubelet\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.220593 
ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.220042 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-run\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.220101 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-var-lib-kubelet\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.220109 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-etc-selinux\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.219941 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-sysconfig\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.220593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.220150 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0e3a9e21-81c9-47c3-8145-7d69cfec4599-sys-fs\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.221913 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:27.221895 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/527aa17b-dc79-48d9-ab47-acf333ccde3f-etc-tuned\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.222501 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.222479 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/527aa17b-dc79-48d9-ab47-acf333ccde3f-tmp\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.222709 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.222688 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7502ad9c-0942-4f13-92a5-5c98853da696-agent-certs\") pod \"konnectivity-agent-j5w9d\" (UID: \"7502ad9c-0942-4f13-92a5-5c98853da696\") " pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:27.228134 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.228112 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc95n\" (UniqueName: \"kubernetes.io/projected/0e3a9e21-81c9-47c3-8145-7d69cfec4599-kube-api-access-pc95n\") pod \"aws-ebs-csi-driver-node-hnxqc\" (UID: \"0e3a9e21-81c9-47c3-8145-7d69cfec4599\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.228460 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.228440 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t68xt\" (UniqueName: \"kubernetes.io/projected/527aa17b-dc79-48d9-ab47-acf333ccde3f-kube-api-access-t68xt\") pod \"tuned-vcbqm\" (UID: \"527aa17b-dc79-48d9-ab47-acf333ccde3f\") " pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.294465 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:29:27.294386 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sw5hp" Apr 24 22:29:27.312359 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.312333 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b6zc7" Apr 24 22:29:27.323261 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.323237 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fq7vn" Apr 24 22:29:27.330827 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.330792 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d8kzk" Apr 24 22:29:27.339346 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.339321 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" Apr 24 22:29:27.345851 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.345832 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:27.355513 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.355492 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:27.361150 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.361131 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" Apr 24 22:29:27.366778 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.366758 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" Apr 24 22:29:27.622744 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.622660 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:27.622915 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.622836 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:27.622915 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.622909 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. No retries permitted until 2026-04-24 22:29:28.622892483 +0000 UTC m=+4.133419680 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:27.723370 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:27.723336 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:27.723545 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.723487 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:27.723545 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.723505 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:27.723545 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.723517 2582 projected.go:194] Error preparing data for projected volume kube-api-access-fd2h4 for pod openshift-network-diagnostics/network-check-target-k24sc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:27.723709 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:27.723582 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4 podName:df25b403-ced4-4c31-9691-1da44a52f2a0 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:28.723563048 +0000 UTC m=+4.234090240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2h4" (UniqueName: "kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4") pod "network-check-target-k24sc" (UID: "df25b403-ced4-4c31-9691-1da44a52f2a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:27.761823 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.761112 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e3a9e21_81c9_47c3_8145_7d69cfec4599.slice/crio-99be8c8cbf3ef571be1c1735f25c6a397a92fe5dc2699b090b9a1e0e6d26f0e5 WatchSource:0}: Error finding container 99be8c8cbf3ef571be1c1735f25c6a397a92fe5dc2699b090b9a1e0e6d26f0e5: Status 404 returned error can't find the container with id 99be8c8cbf3ef571be1c1735f25c6a397a92fe5dc2699b090b9a1e0e6d26f0e5 Apr 24 22:29:27.765601 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.765572 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d48c3e_9be8_4750_ab9c_18ee060b61dd.slice/crio-c99686762b5bfa131a1671aee31b3f9fa84e05c9e49f64b68593049f8e2889e7 WatchSource:0}: Error finding container c99686762b5bfa131a1671aee31b3f9fa84e05c9e49f64b68593049f8e2889e7: Status 404 returned error can't find the container with id c99686762b5bfa131a1671aee31b3f9fa84e05c9e49f64b68593049f8e2889e7 Apr 24 22:29:27.766833 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.766726 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ba8b67_3c2d_436f_beec_62d19349a64d.slice/crio-b830368cd83dc5c6a31e3b3f632ba19aa65a0bdeb510d2d06cfbddc9be247134 WatchSource:0}: Error finding container 
b830368cd83dc5c6a31e3b3f632ba19aa65a0bdeb510d2d06cfbddc9be247134: Status 404 returned error can't find the container with id b830368cd83dc5c6a31e3b3f632ba19aa65a0bdeb510d2d06cfbddc9be247134 Apr 24 22:29:27.768243 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.768222 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7494b64_7d53_401b_8d5f_fbe58b9bf342.slice/crio-d3af8a1add5e41f81be43c10aa948335a0e871c179c6c90e14e85cbb85afab9f WatchSource:0}: Error finding container d3af8a1add5e41f81be43c10aa948335a0e871c179c6c90e14e85cbb85afab9f: Status 404 returned error can't find the container with id d3af8a1add5e41f81be43c10aa948335a0e871c179c6c90e14e85cbb85afab9f Apr 24 22:29:27.790799 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.790768 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacca45df_62e2_4002_8d37_055685b49029.slice/crio-dd740fc1cdd810f42985e457b16f5dbadc0212e9ed6e4898b6740c5fbce79cb4 WatchSource:0}: Error finding container dd740fc1cdd810f42985e457b16f5dbadc0212e9ed6e4898b6740c5fbce79cb4: Status 404 returned error can't find the container with id dd740fc1cdd810f42985e457b16f5dbadc0212e9ed6e4898b6740c5fbce79cb4 Apr 24 22:29:27.791476 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.791454 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod527aa17b_dc79_48d9_ab47_acf333ccde3f.slice/crio-337fc155d181ad1777dc9c6238902d6f8fd1ab109d54f35b2cde03672eff76bc WatchSource:0}: Error finding container 337fc155d181ad1777dc9c6238902d6f8fd1ab109d54f35b2cde03672eff76bc: Status 404 returned error can't find the container with id 337fc155d181ad1777dc9c6238902d6f8fd1ab109d54f35b2cde03672eff76bc Apr 24 22:29:27.792472 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.792447 2582 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb372715_ce4d_476e_881a_eedf339ac388.slice/crio-7a825465e233a7de32efb995fee11222f408005334b952c81937af731c96f247 WatchSource:0}: Error finding container 7a825465e233a7de32efb995fee11222f408005334b952c81937af731c96f247: Status 404 returned error can't find the container with id 7a825465e233a7de32efb995fee11222f408005334b952c81937af731c96f247 Apr 24 22:29:27.793200 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.793180 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1e9be3_c77c_4335_aa84_6e1675c140a1.slice/crio-fe2866a7c6b66f6960bd8c9c774b890b55612c449516253ca09626d66dedab99 WatchSource:0}: Error finding container fe2866a7c6b66f6960bd8c9c774b890b55612c449516253ca09626d66dedab99: Status 404 returned error can't find the container with id fe2866a7c6b66f6960bd8c9c774b890b55612c449516253ca09626d66dedab99 Apr 24 22:29:27.794144 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:29:27.794112 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7502ad9c_0942_4f13_92a5_5c98853da696.slice/crio-56bee914f1b310a038647c54690aec0f6e65978179ab25c30c9164ea1fb49a63 WatchSource:0}: Error finding container 56bee914f1b310a038647c54690aec0f6e65978179ab25c30c9164ea1fb49a63: Status 404 returned error can't find the container with id 56bee914f1b310a038647c54690aec0f6e65978179ab25c30c9164ea1fb49a63 Apr 24 22:29:28.033451 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.033407 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:26 +0000 UTC" deadline="2028-02-04 02:01:51.735807788 +0000 UTC" Apr 24 22:29:28.033451 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.033443 2582 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15603h32m23.702367581s" Apr 24 22:29:28.111995 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.111364 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" event={"ID":"d74e5b7f3ca859862ed6413284694748","Type":"ContainerStarted","Data":"e9bbb58e230eb314d4a3606fa6bf2ff12334f7a68132d38b10bb006fff306639"} Apr 24 22:29:28.113337 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.113302 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j5w9d" event={"ID":"7502ad9c-0942-4f13-92a5-5c98853da696","Type":"ContainerStarted","Data":"56bee914f1b310a038647c54690aec0f6e65978179ab25c30c9164ea1fb49a63"} Apr 24 22:29:28.114343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.114310 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b6zc7" event={"ID":"3d1e9be3-c77c-4335-aa84-6e1675c140a1","Type":"ContainerStarted","Data":"fe2866a7c6b66f6960bd8c9c774b890b55612c449516253ca09626d66dedab99"} Apr 24 22:29:28.116101 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.116051 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d8kzk" event={"ID":"f7494b64-7d53-401b-8d5f-fbe58b9bf342","Type":"ContainerStarted","Data":"d3af8a1add5e41f81be43c10aa948335a0e871c179c6c90e14e85cbb85afab9f"} Apr 24 22:29:28.117491 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.117455 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"b830368cd83dc5c6a31e3b3f632ba19aa65a0bdeb510d2d06cfbddc9be247134"} Apr 24 22:29:28.119391 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.118645 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fq7vn" 
event={"ID":"e6d48c3e-9be8-4750-ab9c-18ee060b61dd","Type":"ContainerStarted","Data":"c99686762b5bfa131a1671aee31b3f9fa84e05c9e49f64b68593049f8e2889e7"} Apr 24 22:29:28.119755 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.119735 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" event={"ID":"0e3a9e21-81c9-47c3-8145-7d69cfec4599","Type":"ContainerStarted","Data":"99be8c8cbf3ef571be1c1735f25c6a397a92fe5dc2699b090b9a1e0e6d26f0e5"} Apr 24 22:29:28.121049 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.121023 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sw5hp" event={"ID":"fb372715-ce4d-476e-881a-eedf339ac388","Type":"ContainerStarted","Data":"7a825465e233a7de32efb995fee11222f408005334b952c81937af731c96f247"} Apr 24 22:29:28.122289 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.122263 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" event={"ID":"527aa17b-dc79-48d9-ab47-acf333ccde3f","Type":"ContainerStarted","Data":"337fc155d181ad1777dc9c6238902d6f8fd1ab109d54f35b2cde03672eff76bc"} Apr 24 22:29:28.123464 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.123433 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerStarted","Data":"dd740fc1cdd810f42985e457b16f5dbadc0212e9ed6e4898b6740c5fbce79cb4"} Apr 24 22:29:28.126012 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.125498 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" podStartSLOduration=2.125483328 podStartE2EDuration="2.125483328s" podCreationTimestamp="2026-04-24 22:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 22:29:28.125016272 +0000 UTC m=+3.635543472" watchObservedRunningTime="2026-04-24 22:29:28.125483328 +0000 UTC m=+3.636010529"
Apr 24 22:29:28.631245 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.631203 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:28.631428 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:28.631375 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:28.631487 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:28.631445 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. No retries permitted until 2026-04-24 22:29:30.631424159 +0000 UTC m=+6.141951343 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:28.731614 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:28.731575 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:28.731765 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:28.731739 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:28.731765 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:28.731758 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:28.731900 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:28.731771 2582 projected.go:194] Error preparing data for projected volume kube-api-access-fd2h4 for pod openshift-network-diagnostics/network-check-target-k24sc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:28.731900 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:28.731847 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4 podName:df25b403-ced4-4c31-9691-1da44a52f2a0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:30.731828943 +0000 UTC m=+6.242356131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2h4" (UniqueName: "kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4") pod "network-check-target-k24sc" (UID: "df25b403-ced4-4c31-9691-1da44a52f2a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:29.102018 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:29.101991 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:29.102388 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:29.102129 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:29.102388 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:29.102135 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:29.102388 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:29.102220 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:30.155306 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:30.154516 2582 generic.go:358] "Generic (PLEG): container finished" podID="48fe24b0efea27e3f5eca2d71913b3e7" containerID="54f1855ebd6f61b313508888fce664779e4475219071ee18fe50f0191c0f0aef" exitCode=0
Apr 24 22:29:30.155306 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:30.154570 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" event={"ID":"48fe24b0efea27e3f5eca2d71913b3e7","Type":"ContainerDied","Data":"54f1855ebd6f61b313508888fce664779e4475219071ee18fe50f0191c0f0aef"}
Apr 24 22:29:30.646149 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:30.645514 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:30.646149 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:30.645680 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:30.646149 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:30.645741 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. No retries permitted until 2026-04-24 22:29:34.645723757 +0000 UTC m=+10.156250950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:30.746819 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:30.746150 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:30.746819 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:30.746361 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:30.746819 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:30.746382 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:30.746819 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:30.746396 2582 projected.go:194] Error preparing data for projected volume kube-api-access-fd2h4 for pod openshift-network-diagnostics/network-check-target-k24sc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:30.746819 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:30.746452 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4 podName:df25b403-ced4-4c31-9691-1da44a52f2a0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:34.746434377 +0000 UTC m=+10.256961558 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2h4" (UniqueName: "kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4") pod "network-check-target-k24sc" (UID: "df25b403-ced4-4c31-9691-1da44a52f2a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:31.101035 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:31.100950 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:31.101197 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:31.101094 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:31.101480 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:31.101451 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:31.101602 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:31.101550 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:33.101014 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:33.100972 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:33.101470 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:33.101106 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:33.101534 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:33.100972 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:33.101676 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:33.101626 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:34.682199 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:34.682161 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:34.682549 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:34.682347 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:34.682549 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:34.682412 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. No retries permitted until 2026-04-24 22:29:42.6823911 +0000 UTC m=+18.192918283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:34.782729 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:34.782687 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:34.782938 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:34.782876 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:34.782938 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:34.782899 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:34.782938 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:34.782911 2582 projected.go:194] Error preparing data for projected volume kube-api-access-fd2h4 for pod openshift-network-diagnostics/network-check-target-k24sc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:34.783096 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:34.782973 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4 podName:df25b403-ced4-4c31-9691-1da44a52f2a0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:42.782951662 +0000 UTC m=+18.293478842 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2h4" (UniqueName: "kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4") pod "network-check-target-k24sc" (UID: "df25b403-ced4-4c31-9691-1da44a52f2a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:35.103033 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:35.102944 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:35.103033 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:35.102986 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:35.103242 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:35.103092 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:35.103242 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:35.103202 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:37.100064 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:37.100026 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:37.100512 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:37.100074 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:37.100512 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:37.100176 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:37.100512 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:37.100322 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:39.100174 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:39.100139 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:39.100631 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:39.100269 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:39.100631 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:39.100305 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:39.100631 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:39.100408 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:40.818051 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:40.818015 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-k6bhr"]
Apr 24 22:29:40.833736 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:40.833712 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:40.833882 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:40.833783 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63"
Apr 24 22:29:40.926893 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:40.926783 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-kubelet-config\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:40.927034 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:40.926927 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-dbus\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:40.927034 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:40.926967 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.028306 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.028270 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-kubelet-config\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.028479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.028342 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-dbus\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.028479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.028365 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.028479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.028418 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-kubelet-config\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.028479 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:41.028473 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:41.028678 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:41.028536 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret podName:ae28b2f3-d733-438a-8a82-1ea82ac5ac63 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:41.528515717 +0000 UTC m=+17.039042909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret") pod "global-pull-secret-syncer-k6bhr" (UID: "ae28b2f3-d733-438a-8a82-1ea82ac5ac63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:41.028678 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.028565 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-dbus\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.099529 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.099495 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:41.099704 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:41.099620 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:41.099704 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.099687 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:41.099904 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:41.099880 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:41.532738 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:41.532701 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:41.532913 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:41.532847 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:41.532913 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:41.532905 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret podName:ae28b2f3-d733-438a-8a82-1ea82ac5ac63 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:42.532891521 +0000 UTC m=+18.043418698 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret") pod "global-pull-secret-syncer-k6bhr" (UID: "ae28b2f3-d733-438a-8a82-1ea82ac5ac63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:42.539549 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:42.539501 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:42.539982 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.539652 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:42.539982 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.539731 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret podName:ae28b2f3-d733-438a-8a82-1ea82ac5ac63 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:44.539708722 +0000 UTC m=+20.050235914 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret") pod "global-pull-secret-syncer-k6bhr" (UID: "ae28b2f3-d733-438a-8a82-1ea82ac5ac63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:42.741392 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:42.741360 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:42.741554 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.741514 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:42.741613 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.741579 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.741564028 +0000 UTC m=+34.252091211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:42.842571 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:42.842475 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:42.842717 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.842676 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:42.842717 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.842703 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:42.842828 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.842719 2582 projected.go:194] Error preparing data for projected volume kube-api-access-fd2h4 for pod openshift-network-diagnostics/network-check-target-k24sc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:42.842828 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:42.842785 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4 podName:df25b403-ced4-4c31-9691-1da44a52f2a0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.842766116 +0000 UTC m=+34.353293297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2h4" (UniqueName: "kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4") pod "network-check-target-k24sc" (UID: "df25b403-ced4-4c31-9691-1da44a52f2a0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:43.100265 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:43.100183 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:43.100429 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:43.100183 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:43.100429 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:43.100301 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63"
Apr 24 22:29:43.100429 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:43.100190 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:43.100429 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:43.100409 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:43.100606 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:43.100522 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:44.553395 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:44.553358 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:44.553772 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:44.553521 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:44.553772 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:44.553590 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret podName:ae28b2f3-d733-438a-8a82-1ea82ac5ac63 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:48.553574479 +0000 UTC m=+24.064101657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret") pod "global-pull-secret-syncer-k6bhr" (UID: "ae28b2f3-d733-438a-8a82-1ea82ac5ac63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:45.101061 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:45.101027 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:45.101231 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:45.101132 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63"
Apr 24 22:29:45.101231 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:45.101219 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:45.101369 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:45.101343 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:45.101423 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:45.101389 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:45.101468 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:45.101438 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:46.185047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.184705 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" event={"ID":"48fe24b0efea27e3f5eca2d71913b3e7","Type":"ContainerStarted","Data":"c8c3c475c5ee6b3689876bfe12ca0e34636f7eb7fcd608441934b928c951e48c"}
Apr 24 22:29:46.186382 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.186351 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sw5hp" event={"ID":"fb372715-ce4d-476e-881a-eedf339ac388","Type":"ContainerStarted","Data":"11dd195011c189a108bc539fc7b7f6fe50502c22ef56987df717360019eb2362"}
Apr 24 22:29:46.188243 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.188211 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" event={"ID":"527aa17b-dc79-48d9-ab47-acf333ccde3f","Type":"ContainerStarted","Data":"2d2178ce487e71e207c162f1cbbe046b4abfb9e94bf93be3f57c74707ba1b817"}
Apr 24 22:29:46.190030 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.190003 2582 generic.go:358] "Generic (PLEG): container finished" podID="acca45df-62e2-4002-8d37-055685b49029" containerID="b7d725078618a0c8adf7ce055a3dbc983d994f533bd3dea7abc515ed958bbd1e" exitCode=0
Apr 24 22:29:46.190145 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.190085 2582
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerDied","Data":"b7d725078618a0c8adf7ce055a3dbc983d994f533bd3dea7abc515ed958bbd1e"} Apr 24 22:29:46.191440 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.191414 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j5w9d" event={"ID":"7502ad9c-0942-4f13-92a5-5c98853da696","Type":"ContainerStarted","Data":"26684ef776e87d328b881c07eb2114a02b91c5b1b3c49daa0bdceee3ed18fe7b"} Apr 24 22:29:46.192760 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.192734 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d8kzk" event={"ID":"f7494b64-7d53-401b-8d5f-fbe58b9bf342","Type":"ContainerStarted","Data":"0048948790b98f96013d21fc6f23efabfa4ab8f9a822fb80bbb5b6fa0b6f89af"} Apr 24 22:29:46.195188 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.194994 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"01b7a1b5b6fee27e73d65e245635b7483442002724d982bb2f6e4c8838b82876"} Apr 24 22:29:46.195188 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.195025 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"c925894dc26a9da844805433f5fe4d226cc9873166d0631448827461b6eb8451"} Apr 24 22:29:46.196485 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.196456 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fq7vn" event={"ID":"e6d48c3e-9be8-4750-ab9c-18ee060b61dd","Type":"ContainerStarted","Data":"42c143cdf3ce9c666d1b7e709692ccb0b6b7e70c2a16014f50961323d707536f"} Apr 24 22:29:46.197970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.197946 
2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" event={"ID":"0e3a9e21-81c9-47c3-8145-7d69cfec4599","Type":"ContainerStarted","Data":"2f5b60decb0d1f1c53fdb5b52772f0358b7f8660941999fb3d3d4052031097dc"} Apr 24 22:29:46.200202 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.200148 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" podStartSLOduration=20.200132792 podStartE2EDuration="20.200132792s" podCreationTimestamp="2026-04-24 22:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:46.199800321 +0000 UTC m=+21.710327520" watchObservedRunningTime="2026-04-24 22:29:46.200132792 +0000 UTC m=+21.710659993" Apr 24 22:29:46.214005 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.213961 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d8kzk" podStartSLOduration=3.663579816 podStartE2EDuration="21.213947318s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.789387861 +0000 UTC m=+3.299915039" lastFinishedPulling="2026-04-24 22:29:45.339755353 +0000 UTC m=+20.850282541" observedRunningTime="2026-04-24 22:29:46.213515909 +0000 UTC m=+21.724043109" watchObservedRunningTime="2026-04-24 22:29:46.213947318 +0000 UTC m=+21.724474517" Apr 24 22:29:46.226202 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.226157 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j5w9d" podStartSLOduration=3.7706607070000002 podStartE2EDuration="21.226143726s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.797169146 +0000 UTC m=+3.307696328" lastFinishedPulling="2026-04-24 22:29:45.25265215 +0000 UTC 
m=+20.763179347" observedRunningTime="2026-04-24 22:29:46.225682295 +0000 UTC m=+21.736209495" watchObservedRunningTime="2026-04-24 22:29:46.226143726 +0000 UTC m=+21.736670964" Apr 24 22:29:46.240593 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.240546 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vcbqm" podStartSLOduration=3.76375932 podStartE2EDuration="21.240529782s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.797407319 +0000 UTC m=+3.307934501" lastFinishedPulling="2026-04-24 22:29:45.274177771 +0000 UTC m=+20.784704963" observedRunningTime="2026-04-24 22:29:46.240154047 +0000 UTC m=+21.750681247" watchObservedRunningTime="2026-04-24 22:29:46.240529782 +0000 UTC m=+21.751056983" Apr 24 22:29:46.252849 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.252789 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fq7vn" podStartSLOduration=3.767699696 podStartE2EDuration="21.252777208s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.789358856 +0000 UTC m=+3.299886035" lastFinishedPulling="2026-04-24 22:29:45.274436353 +0000 UTC m=+20.784963547" observedRunningTime="2026-04-24 22:29:46.252370855 +0000 UTC m=+21.762898055" watchObservedRunningTime="2026-04-24 22:29:46.252777208 +0000 UTC m=+21.763304398" Apr 24 22:29:46.289702 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.289660 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sw5hp" podStartSLOduration=3.8127786500000003 podStartE2EDuration="21.289647524s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.797131695 +0000 UTC m=+3.307658877" lastFinishedPulling="2026-04-24 22:29:45.274000566 +0000 UTC m=+20.784527751" observedRunningTime="2026-04-24 22:29:46.289228243 +0000 UTC 
m=+21.799755442" watchObservedRunningTime="2026-04-24 22:29:46.289647524 +0000 UTC m=+21.800174724" Apr 24 22:29:46.402485 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:46.402463 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:29:47.070419 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.070069 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:29:46.402480194Z","UUID":"8357530c-6d4e-4871-b2f2-6bf5ce3138a3","Handler":null,"Name":"","Endpoint":""} Apr 24 22:29:47.073400 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.073372 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:29:47.073544 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.073408 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:29:47.099761 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.099733 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:29:47.099940 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:47.099869 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63" Apr 24 22:29:47.100030 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.100006 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:47.100136 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:47.100114 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb" Apr 24 22:29:47.100331 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.100315 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:47.100435 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:47.100401 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0" Apr 24 22:29:47.202323 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.202277 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" event={"ID":"0e3a9e21-81c9-47c3-8145-7d69cfec4599","Type":"ContainerStarted","Data":"256e8d92d9587f97b11363309a9235e0cd9bc547728c78f005bbe66ef0547ba1"} Apr 24 22:29:47.203941 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.203793 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b6zc7" event={"ID":"3d1e9be3-c77c-4335-aa84-6e1675c140a1","Type":"ContainerStarted","Data":"3678b75653c0b4f6554a6a4e3c317a4f31a6b54da2186b3ab68387c00bb1b63c"} Apr 24 22:29:47.206870 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.206841 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"975476097e0d26f82c55815a055153c06e3a90128a5712ec24e6cdbbee7051a2"} Apr 24 22:29:47.206981 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.206881 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"cb3acbd07ed6c99a8b357f77806629a2071e20a60de055b171fa73ed73ee6141"} Apr 24 22:29:47.206981 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.206896 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"2ffa9cddc25b5005fe2ac2621d10e4b7be6ad38e3d19e675cf6b7ab182a268ab"} Apr 24 22:29:47.206981 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.206909 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"1cd07d9884e73a08cb289ad8d34e3d232735af19f6b16d106012fe19e8a64ad2"} Apr 24 22:29:47.231570 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:47.231524 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-b6zc7" podStartSLOduration=4.752208863 podStartE2EDuration="22.231512265s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.797472338 +0000 UTC m=+3.307999521" lastFinishedPulling="2026-04-24 22:29:45.276775731 +0000 UTC m=+20.787302923" observedRunningTime="2026-04-24 22:29:47.231214816 +0000 UTC m=+22.741742018" watchObservedRunningTime="2026-04-24 22:29:47.231512265 +0000 UTC m=+22.742039464" Apr 24 22:29:48.210219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:48.210173 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" event={"ID":"0e3a9e21-81c9-47c3-8145-7d69cfec4599","Type":"ContainerStarted","Data":"af6873016eabda711d0f690adf72efdf215fbc0c032b84c0ee4275d1beefda93"} Apr 24 22:29:48.230295 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:48.230245 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hnxqc" podStartSLOduration=3.815295577 podStartE2EDuration="23.230227627s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.762980508 +0000 UTC m=+3.273507689" lastFinishedPulling="2026-04-24 22:29:47.177912557 +0000 UTC m=+22.688439739" observedRunningTime="2026-04-24 22:29:48.230223488 +0000 UTC m=+23.740750688" watchObservedRunningTime="2026-04-24 22:29:48.230227627 +0000 UTC m=+23.740754827" Apr 24 22:29:48.585781 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:48.585757 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:29:48.585919 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:48.585895 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:48.585966 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:48.585954 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret podName:ae28b2f3-d733-438a-8a82-1ea82ac5ac63 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.585936736 +0000 UTC m=+32.096463921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret") pod "global-pull-secret-syncer-k6bhr" (UID: "ae28b2f3-d733-438a-8a82-1ea82ac5ac63") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:49.100003 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:49.099969 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:29:49.100182 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:49.100017 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:49.100182 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:49.100095 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63" Apr 24 22:29:49.100182 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:49.100167 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb" Apr 24 22:29:49.100305 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:49.100218 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:49.100305 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:49.100290 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0" Apr 24 22:29:49.215639 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:49.215601 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"8cc4fbb4ba89f6082f8dd56c6b69ba505abf13f7244b56c1b41d7418c59d1e74"} Apr 24 22:29:49.852027 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:49.851989 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:51.100169 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.099907 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:51.100169 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.099907 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:51.100855 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:51.100182 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0" Apr 24 22:29:51.100855 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.099907 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:29:51.100855 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:51.100236 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb" Apr 24 22:29:51.100855 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:51.100310 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63" Apr 24 22:29:51.155137 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.155107 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:51.155686 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.155669 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:51.222273 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.222241 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" event={"ID":"59ba8b67-3c2d-436f-beec-62d19349a64d","Type":"ContainerStarted","Data":"276cdbb3fd95f100ebb7de8f2f4cbc54dcda97325d4a885ee4d0df7c50a88e15"} Apr 24 22:29:51.222567 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.222549 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:51.222706 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.222576 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:51.223966 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.223858 2582 generic.go:358] "Generic (PLEG): container finished" podID="acca45df-62e2-4002-8d37-055685b49029" containerID="acb2f6533ac97d44afa505a62f34a5a50a5dc50acf079b63aacd687055a2033c" exitCode=0 Apr 24 22:29:51.223966 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.223940 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerDied","Data":"acb2f6533ac97d44afa505a62f34a5a50a5dc50acf079b63aacd687055a2033c"} Apr 24 22:29:51.224631 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.224556 2582 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j5w9d" Apr 24 22:29:51.238607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.238588 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:51.255935 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:51.255891 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" podStartSLOduration=8.249628134 podStartE2EDuration="26.255876685s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.789383897 +0000 UTC m=+3.299911075" lastFinishedPulling="2026-04-24 22:29:45.795632444 +0000 UTC m=+21.306159626" observedRunningTime="2026-04-24 22:29:51.254311485 +0000 UTC m=+26.764838695" watchObservedRunningTime="2026-04-24 22:29:51.255876685 +0000 UTC m=+26.766403885" Apr 24 22:29:52.226577 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.226545 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:52.245967 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.245938 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:29:52.727314 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.727142 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k6bhr"] Apr 24 22:29:52.727489 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.727422 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:29:52.727555 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:52.727533 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63" Apr 24 22:29:52.730476 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.730412 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k24sc"] Apr 24 22:29:52.730595 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.730516 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:52.730650 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:52.730605 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0" Apr 24 22:29:52.735560 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.735535 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-44r7l"] Apr 24 22:29:52.735665 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:52.735652 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:52.735757 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:52.735739 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb" Apr 24 22:29:53.229529 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:53.229443 2582 generic.go:358] "Generic (PLEG): container finished" podID="acca45df-62e2-4002-8d37-055685b49029" containerID="6b51f3f43339cee74264b8e389d53f53a96add7d66c0ca98c141acdca80093c1" exitCode=0 Apr 24 22:29:53.229878 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:53.229533 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerDied","Data":"6b51f3f43339cee74264b8e389d53f53a96add7d66c0ca98c141acdca80093c1"} Apr 24 22:29:54.100420 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:54.100388 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:54.100541 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:54.100424 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:29:54.100541 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:54.100495 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:54.100624 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:54.100605 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63"
Apr 24 22:29:54.233247 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:54.233141 2582 generic.go:358] "Generic (PLEG): container finished" podID="acca45df-62e2-4002-8d37-055685b49029" containerID="d6458d8cb22b2ef15fc6244b3f07ec7ad09e63cecefb3c2221d59e1d4a97aa28" exitCode=0
Apr 24 22:29:54.233587 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:54.233243 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerDied","Data":"d6458d8cb22b2ef15fc6244b3f07ec7ad09e63cecefb3c2221d59e1d4a97aa28"}
Apr 24 22:29:55.100930 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:55.100845 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:55.101063 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:55.100970 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:56.100177 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:56.100145 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:56.100721 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:56.100145 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:56.100721 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:56.100263 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63"
Apr 24 22:29:56.100721 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:56.100331 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:56.651080 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:56.651039 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:56.651243 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:56.651216 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:56.651299 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:56.651288 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret podName:ae28b2f3-d733-438a-8a82-1ea82ac5ac63 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.651273067 +0000 UTC m=+48.161800244 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret") pod "global-pull-secret-syncer-k6bhr" (UID: "ae28b2f3-d733-438a-8a82-1ea82ac5ac63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:57.100053 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:57.099972 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:57.100202 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:57.100117 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44r7l" podUID="c19cc309-d892-45ed-a3cd-43a98273bafb"
Apr 24 22:29:58.100072 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.100036 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:29:58.100072 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.100036 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr"
Apr 24 22:29:58.100641 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.100167 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k24sc" podUID="df25b403-ced4-4c31-9691-1da44a52f2a0"
Apr 24 22:29:58.100641 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.100342 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k6bhr" podUID="ae28b2f3-d733-438a-8a82-1ea82ac5ac63"
Apr 24 22:29:58.233579 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.233545 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeReady"
Apr 24 22:29:58.233822 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.233696 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 22:29:58.290542 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.290511 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86b6cf9d64-bbh5q"]
Apr 24 22:29:58.326498 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.326465 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tbw4p"]
Apr 24 22:29:58.326663 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.326616 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.331607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.330750 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 22:29:58.331607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.331221 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lr488\""
Apr 24 22:29:58.331607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.331280 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 22:29:58.331607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.331476 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 22:29:58.341483 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.341458 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-74654b95d8-zp62l"]
Apr 24 22:29:58.353721 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.353628 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 22:29:58.368304 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.368262 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"]
Apr 24 22:29:58.368458 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.368379 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.373766 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.373742 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.373979 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.373828 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 22:29:58.377918 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.377894 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.378222 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.378201 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-czhdc\""
Apr 24 22:29:58.378434 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.378418 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 22:29:58.384296 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.384266 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 22:29:58.386978 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.386958 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-p6xqh"]
Apr 24 22:29:58.387083 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.387068 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:29:58.387168 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.387149 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.393050 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393005 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 22:29:58.393050 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393029 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 22:29:58.393214 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393084 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-dqtv6\""
Apr 24 22:29:58.393214 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393091 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 22:29:58.393349 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393284 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 22:29:58.393349 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393342 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.393473 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393345 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.393473 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393416 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2xb78\""
Apr 24 22:29:58.393618 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393485 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.393618 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393558 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.393745 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.393641 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 22:29:58.394524 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.394507 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 22:29:58.416989 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.416968 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"]
Apr 24 22:29:58.417095 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.417078 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.419894 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.419875 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 22:29:58.419995 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.419896 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 22:29:58.420040 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.420030 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.420290 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.420270 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.420653 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.420637 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-nr5fz\""
Apr 24 22:29:58.438313 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.438284 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"]
Apr 24 22:29:58.438408 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.438317 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"
Apr 24 22:29:58.441379 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.441358 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 22:29:58.441478 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.441438 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 22:29:58.441478 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.441455 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.441577 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.441518 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.441694 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.441677 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-c2l8h\""
Apr 24 22:29:58.460049 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.460025 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"]
Apr 24 22:29:58.460151 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.460128 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"
Apr 24 22:29:58.462487 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.462470 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-ln8tt\""
Apr 24 22:29:58.462589 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.462491 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.463004 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.462987 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.464445 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464421 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:29:58.464522 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464480 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-stats-auth\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:29:58.464522 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464505 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3517afbe-450e-4a99-a668-a2cc8ca01cbc-trusted-ca\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.464598 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464521 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:29:58.464598 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464538 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-trusted-ca\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464598 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464564 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-installation-pull-secrets\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464602 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3517afbe-450e-4a99-a668-a2cc8ca01cbc-serving-cert\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.464730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464636 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-image-registry-private-configuration\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464659 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d1f9a0b-38db-4713-969a-f7221408b685-ca-trust-extracted\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464685 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwj74\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-kube-api-access-fwj74\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464730 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464714 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.464970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464735 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.464970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464759 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464837 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-registry-certificates\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.464970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464867 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3517afbe-450e-4a99-a668-a2cc8ca01cbc-config\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.464970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464893 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45s2p\" (UniqueName: \"kubernetes.io/projected/3517afbe-450e-4a99-a668-a2cc8ca01cbc-kube-api-access-45s2p\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.464970 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464949 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jnq\" (UniqueName: \"kubernetes.io/projected/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-kube-api-access-k9jnq\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.465206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.464987 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-bound-sa-token\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.465206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.465031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-default-certificate\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:29:58.465206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.465073 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nc4\" (UniqueName: \"kubernetes.io/projected/a46814a2-9573-4978-a715-70fdad9204e4-kube-api-access-65nc4\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:29:58.478434 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.478416 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"]
Apr 24 22:29:58.478613 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.478539 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.481595 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.481560 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 22:29:58.483575 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.482771 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 22:29:58.483575 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.482970 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.483575 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.482980 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.483575 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.483050 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-lm4nh\""
Apr 24 22:29:58.495351 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495329 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tbw4p"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495362 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86b6cf9d64-bbh5q"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495380 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74654b95d8-zp62l"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495393 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495404 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495414 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-p6xqh"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495425 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495435 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"]
Apr 24 22:29:58.495446 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495449 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6w7zw"]
Apr 24 22:29:58.495800 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.495379 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.499479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.498876 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.499479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.499178 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-m58ng\""
Apr 24 22:29:58.499479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.499255 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.499479 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.499346 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 22:29:58.501451 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.501432 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 22:29:58.514417 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.514396 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4mm5b"]
Apr 24 22:29:58.514557 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.514541 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.518412 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.518379 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qwslk\""
Apr 24 22:29:58.518506 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.518433 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 22:29:58.519135 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.519119 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 22:29:58.526231 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.526213 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg"]
Apr 24 22:29:58.526375 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.526359 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mm5b"
Apr 24 22:29:58.530710 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.530692 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 22:29:58.530884 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.530696 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x6wvt\""
Apr 24 22:29:58.531164 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.530699 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.531404 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.531384 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.538549 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.538530 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"]
Apr 24 22:29:58.538681 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.538664 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg"
Apr 24 22:29:58.545472 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.545452 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.545579 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.545530 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mfg8r\""
Apr 24 22:29:58.545741 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.545724 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.550595 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.550575 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"]
Apr 24 22:29:58.550745 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.550732 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.556975 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.556953 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 22:29:58.557455 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.557432 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 22:29:58.557548 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.557473 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 22:29:58.557548 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.557489 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 22:29:58.562585 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.562566 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"]
Apr 24 22:29:58.562754 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.562734 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565376 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565615 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565651 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565659 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565676 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565687 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565719 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3517afbe-450e-4a99-a668-a2cc8ca01cbc-config\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565755 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngms\" (UniqueName: \"kubernetes.io/projected/b95e6492-0f90-4364-8816-060a9df92b34-kube-api-access-fngms\") pod \"volume-data-source-validator-7c6cbb6c87-dnmh6\" (UID: \"b95e6492-0f90-4364-8816-060a9df92b34\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6" Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565771 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565784 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/5d82ca42-78a8-4968-9083-5d9f43035324-kube-api-access-h5hk6\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565902 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jnq\" (UniqueName: \"kubernetes.io/projected/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-kube-api-access-k9jnq\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.565934 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-bound-sa-token\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.565905 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.566210 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86b6cf9d64-bbh5q: secret "image-registry-tls" not found
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.566295 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls podName:0d1f9a0b-38db-4713-969a-f7221408b685 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.066274579 +0000 UTC m=+34.576801772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls") pod "image-registry-86b6cf9d64-bbh5q" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685") : secret "image-registry-tls" not found
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566329 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.566876 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566418 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3517afbe-450e-4a99-a668-a2cc8ca01cbc-trusted-ca\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566453 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9804071e-5980-4f1d-95ce-ff7b5002d9d9-config\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566501 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3517afbe-450e-4a99-a668-a2cc8ca01cbc-config\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566508 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23d7baa2-f8ae-4472-abcb-860f16acd197-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566561 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3517afbe-450e-4a99-a668-a2cc8ca01cbc-serving-cert\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566621 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-image-registry-private-configuration\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566647 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d1f9a0b-38db-4713-969a-f7221408b685-ca-trust-extracted\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566676 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwj74\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-kube-api-access-fwj74\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.566705 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.06668774 +0000 UTC m=+34.577214941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : configmap references non-existent config key: service-ca.crt
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566743 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4j7q\" (UniqueName: \"kubernetes.io/projected/23d7baa2-f8ae-4472-abcb-860f16acd197-kube-api-access-v4j7q\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566775 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-registry-certificates\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566818 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45s2p\" (UniqueName: \"kubernetes.io/projected/3517afbe-450e-4a99-a668-a2cc8ca01cbc-kube-api-access-45s2p\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566836 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23d7baa2-f8ae-4472-abcb-860f16acd197-tmp\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: 
\"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566878 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:29:58.567699 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566914 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-default-certificate\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566943 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9lkh\" (UniqueName: \"kubernetes.io/projected/9804071e-5980-4f1d-95ce-ff7b5002d9d9-kube-api-access-s9lkh\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.566974 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23d7baa2-f8ae-4472-abcb-860f16acd197-serving-cert\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 
22:29:58.567004 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65nc4\" (UniqueName: \"kubernetes.io/projected/a46814a2-9573-4978-a715-70fdad9204e4-kube-api-access-65nc4\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567027 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9804071e-5980-4f1d-95ce-ff7b5002d9d9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567051 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23d7baa2-f8ae-4472-abcb-860f16acd197-service-ca-bundle\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567072 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23d7baa2-f8ae-4472-abcb-860f16acd197-snapshots\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567107 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-stats-auth\") pod 
\"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567137 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567159 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-trusted-ca\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567181 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-installation-pull-secrets\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.567292 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3517afbe-450e-4a99-a668-a2cc8ca01cbc-trusted-ca\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:29:58.568354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.568203 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-registry-certificates\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.568756 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.568603 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d1f9a0b-38db-4713-969a-f7221408b685-ca-trust-extracted\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.569520 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.569342 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 22:29:58.569520 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.569401 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.069385633 +0000 UTC m=+34.579912824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : secret "router-metrics-certs-default" not found
Apr 24 22:29:58.571584 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.571521 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.571773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.571709 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-installation-pull-secrets\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.571928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.571908 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-image-registry-private-configuration\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:29:58.572333 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.572316 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-default-certificate\") pod \"router-default-74654b95d8-zp62l\" (UID: 
\"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.572410 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.572351 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-stats-auth\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.572597 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.572573 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-trusted-ca\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.572911 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.572891 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3517afbe-450e-4a99-a668-a2cc8ca01cbc-serving-cert\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:29:58.577492 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.577470 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-bound-sa-token\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.577916 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.577895 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwj74\" (UniqueName: 
\"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-kube-api-access-fwj74\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:58.579743 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.579721 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jnq\" (UniqueName: \"kubernetes.io/projected/de154e2a-4cec-4799-b4c9-72ed4e2d85c4-kube-api-access-k9jnq\") pod \"kube-storage-version-migrator-operator-6769c5d45-9c9zw\" (UID: \"de154e2a-4cec-4799-b4c9-72ed4e2d85c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw" Apr 24 22:29:58.581884 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.581636 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"] Apr 24 22:29:58.581884 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.581840 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb" Apr 24 22:29:58.584706 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.584684 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45s2p\" (UniqueName: \"kubernetes.io/projected/3517afbe-450e-4a99-a668-a2cc8ca01cbc-kube-api-access-45s2p\") pod \"console-operator-9d4b6777b-tbw4p\" (UID: \"3517afbe-450e-4a99-a668-a2cc8ca01cbc\") " pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:29:58.588607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.588137 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 22:29:58.588607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.588146 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-g8xh6\"" Apr 24 22:29:58.590375 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.590347 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nc4\" (UniqueName: \"kubernetes.io/projected/a46814a2-9573-4978-a715-70fdad9204e4-kube-api-access-65nc4\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:58.605881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605824 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"] Apr 24 22:29:58.605881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605856 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4mm5b"] Apr 24 22:29:58.605881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605869 2582 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg"] Apr 24 22:29:58.605881 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605880 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6w7zw"] Apr 24 22:29:58.606112 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605890 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"] Apr 24 22:29:58.606112 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605900 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"] Apr 24 22:29:58.606112 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605914 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"] Apr 24 22:29:58.606112 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605924 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"] Apr 24 22:29:58.606112 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.605936 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"
Apr 24 22:29:58.610266 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.610247 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 22:29:58.610371 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.610333 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-kp2cb\""
Apr 24 22:29:58.610554 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.610540 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 22:29:58.668013 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.667975 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4j7q\" (UniqueName: \"kubernetes.io/projected/23d7baa2-f8ae-4472-abcb-860f16acd197-kube-api-access-v4j7q\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.668173 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668023 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba1f6487-5042-4d0a-8bb2-4f385d224529-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.668173 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668041 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-ca\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.668173 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668152 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7t7l\" (UniqueName: \"kubernetes.io/projected/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-kube-api-access-t7t7l\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.668350 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668179 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23d7baa2-f8ae-4472-abcb-860f16acd197-tmp\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.668350 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668198 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.668350 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668220 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b"
Apr 24 22:29:58.668350 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668288 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps7g4\" (UniqueName: \"kubernetes.io/projected/5238b830-7ee8-4057-83b0-9eb79541d31e-kube-api-access-ps7g4\") pod \"managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb\" (UID: \"5238b830-7ee8-4057-83b0-9eb79541d31e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"
Apr 24 22:29:58.668550 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"
Apr 24 22:29:58.668550 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668422 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9lkh\" (UniqueName: \"kubernetes.io/projected/9804071e-5980-4f1d-95ce-ff7b5002d9d9-kube-api-access-s9lkh\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.668550 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668453 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfz7d\" (UniqueName: \"kubernetes.io/projected/9b249279-0358-4efe-bcbd-16ecb96ece58-kube-api-access-zfz7d\") pod \"network-check-source-8894fc9bd-nncrg\" (UID: \"9b249279-0358-4efe-bcbd-16ecb96ece58\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg"
Apr 24 22:29:58.668550 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668493 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23d7baa2-f8ae-4472-abcb-860f16acd197-serving-cert\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.668550 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668521 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23d7baa2-f8ae-4472-abcb-860f16acd197-tmp\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.668550 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.668539 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.668614 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls podName:5d82ca42-78a8-4968-9083-5d9f43035324 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.168595753 +0000 UTC m=+34.679122931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lwc4g" (UID: "5d82ca42-78a8-4968-9083-5d9f43035324") : secret "samples-operator-tls" not found
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668658 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9804071e-5980-4f1d-95ce-ff7b5002d9d9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668706 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snkfh\" (UniqueName: \"kubernetes.io/projected/ba1f6487-5042-4d0a-8bb2-4f385d224529-kube-api-access-snkfh\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668743 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23d7baa2-f8ae-4472-abcb-860f16acd197-service-ca-bundle\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668770 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhj6x\" (UniqueName: \"kubernetes.io/projected/e4b92403-6523-4474-832b-2bb3cb7d7b9d-kube-api-access-zhj6x\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668798 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-hub\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.668873 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668870 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668901 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23d7baa2-f8ae-4472-abcb-860f16acd197-snapshots\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668946 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.668987 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv2x\" (UniqueName: \"kubernetes.io/projected/6d2a4a7d-d624-412e-a43d-016ca5e90208-kube-api-access-dwv2x\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669042 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fngms\" (UniqueName: \"kubernetes.io/projected/b95e6492-0f90-4364-8816-060a9df92b34-kube-api-access-fngms\") pod \"volume-data-source-validator-7c6cbb6c87-dnmh6\" (UID: \"b95e6492-0f90-4364-8816-060a9df92b34\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669065 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/5d82ca42-78a8-4968-9083-5d9f43035324-kube-api-access-h5hk6\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669092 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5x8\" (UniqueName: \"kubernetes.io/projected/3a180131-c839-45eb-9da2-6f9ffa71d641-kube-api-access-mr5x8\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669114 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b92403-6523-4474-832b-2bb3cb7d7b9d-config-volume\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669134 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4b92403-6523-4474-832b-2bb3cb7d7b9d-tmp-dir\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669156 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d2a4a7d-d624-412e-a43d-016ca5e90208-tmp\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.669219 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669183 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9804071e-5980-4f1d-95ce-ff7b5002d9d9-config\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669241 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23d7baa2-f8ae-4472-abcb-860f16acd197-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669286 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6d2a4a7d-d624-412e-a43d-016ca5e90208-klusterlet-config\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669298 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23d7baa2-f8ae-4472-abcb-860f16acd197-service-ca-bundle\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669313 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669335 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5238b830-7ee8-4057-83b0-9eb79541d31e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb\" (UID: \"5238b830-7ee8-4057-83b0-9eb79541d31e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669357 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.669928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.669487 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23d7baa2-f8ae-4472-abcb-860f16acd197-snapshots\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.670261 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.670007 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9804071e-5980-4f1d-95ce-ff7b5002d9d9-config\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.670261 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.670201 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23d7baa2-f8ae-4472-abcb-860f16acd197-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.671353 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.671333 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23d7baa2-f8ae-4472-abcb-860f16acd197-serving-cert\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.671456 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.671407 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9804071e-5980-4f1d-95ce-ff7b5002d9d9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.680953 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.680930 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:29:58.683497 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.683475 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9lkh\" (UniqueName: \"kubernetes.io/projected/9804071e-5980-4f1d-95ce-ff7b5002d9d9-kube-api-access-s9lkh\") pod \"service-ca-operator-d6fc45fc5-tpzdt\" (UID: \"9804071e-5980-4f1d-95ce-ff7b5002d9d9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"
Apr 24 22:29:58.684015 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.683992 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4j7q\" (UniqueName: \"kubernetes.io/projected/23d7baa2-f8ae-4472-abcb-860f16acd197-kube-api-access-v4j7q\") pod \"insights-operator-585dfdc468-p6xqh\" (UID: \"23d7baa2-f8ae-4472-abcb-860f16acd197\") " pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.691833 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.691794 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngms\" (UniqueName: \"kubernetes.io/projected/b95e6492-0f90-4364-8816-060a9df92b34-kube-api-access-fngms\") pod \"volume-data-source-validator-7c6cbb6c87-dnmh6\" (UID: \"b95e6492-0f90-4364-8816-060a9df92b34\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"
Apr 24 22:29:58.691902 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.691834 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/5d82ca42-78a8-4968-9083-5d9f43035324-kube-api-access-h5hk6\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"
Apr 24 22:29:58.704790 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.704766 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"
Apr 24 22:29:58.732798 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.732771 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-p6xqh"
Apr 24 22:29:58.770389 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770364 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"
Apr 24 22:29:58.770635 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770606 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6d2a4a7d-d624-412e-a43d-016ca5e90208-klusterlet-config\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.770756 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770646 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.770756 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770670 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5238b830-7ee8-4057-83b0-9eb79541d31e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb\" (UID: \"5238b830-7ee8-4057-83b0-9eb79541d31e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"
Apr 24 22:29:58.770756 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770735 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.770918 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770773 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba1f6487-5042-4d0a-8bb2-4f385d224529-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.770918 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770798 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-ca\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.770918 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770841 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7t7l\" (UniqueName: \"kubernetes.io/projected/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-kube-api-access-t7t7l\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.770918 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770876 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.770918 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770904 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b"
Apr 24 22:29:58.771078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770928 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps7g4\" (UniqueName: \"kubernetes.io/projected/5238b830-7ee8-4057-83b0-9eb79541d31e-kube-api-access-ps7g4\") pod \"managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb\" (UID: \"5238b830-7ee8-4057-83b0-9eb79541d31e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"
Apr 24 22:29:58.771078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770956 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:29:58.771078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.770997 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"
Apr 24 22:29:58.771078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771033 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfz7d\" (UniqueName: \"kubernetes.io/projected/9b249279-0358-4efe-bcbd-16ecb96ece58-kube-api-access-zfz7d\") pod \"network-check-source-8894fc9bd-nncrg\" (UID: \"9b249279-0358-4efe-bcbd-16ecb96ece58\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771076 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snkfh\" (UniqueName: \"kubernetes.io/projected/ba1f6487-5042-4d0a-8bb2-4f385d224529-kube-api-access-snkfh\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771106 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj6x\" (UniqueName: \"kubernetes.io/projected/e4b92403-6523-4474-832b-2bb3cb7d7b9d-kube-api-access-zhj6x\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771132 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-hub\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771156 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771202 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771244 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv2x\" (UniqueName: \"kubernetes.io/projected/6d2a4a7d-d624-412e-a43d-016ca5e90208-kube-api-access-dwv2x\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771301 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5x8\" (UniqueName: \"kubernetes.io/projected/3a180131-c839-45eb-9da2-6f9ffa71d641-kube-api-access-mr5x8\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b"
Apr 24 22:29:58.771343 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771329 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b92403-6523-4474-832b-2bb3cb7d7b9d-config-volume\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771357 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4b92403-6523-4474-832b-2bb3cb7d7b9d-tmp-dir\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d2a4a7d-d624-412e-a43d-016ca5e90208-tmp\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771418 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f63adb0-5994-496d-880d-9d660e539622-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.770777 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771501 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.771539 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba1f6487-5042-4d0a-8bb2-4f385d224529-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:29:58.771619 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.771553 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls podName:e4b92403-6523-4474-832b-2bb3cb7d7b9d nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.271535277 +0000 UTC m=+34.782062458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls") pod "dns-default-6w7zw" (UID: "e4b92403-6523-4474-832b-2bb3cb7d7b9d") : secret "dns-default-metrics-tls" not found
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.772074 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.772126 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d2a4a7d-d624-412e-a43d-016ca5e90208-tmp\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.772228 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b92403-6523-4474-832b-2bb3cb7d7b9d-config-volume\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.773011 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.272992084 +0000 UTC m=+34.783519278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.772129 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.773080 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert podName:3a180131-c839-45eb-9da2-6f9ffa71d641 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:59.273062287 +0000 UTC m=+34.783589478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert") pod "ingress-canary-4mm5b" (UID: "3a180131-c839-45eb-9da2-6f9ffa71d641") : secret "canary-serving-cert" not found
Apr 24 22:29:58.773078 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.772289 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:58.773357 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.773124 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs podName:c19cc309-d892-45ed-a3cd-43a98273bafb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.773113561 +0000 UTC m=+66.283640747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs") pod "network-metrics-daemon-44r7l" (UID: "c19cc309-d892-45ed-a3cd-43a98273bafb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:58.774276 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.774237 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5238b830-7ee8-4057-83b0-9eb79541d31e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb\" (UID: \"5238b830-7ee8-4057-83b0-9eb79541d31e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"
Apr 24 22:29:58.774381 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.774359 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-hub\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"
Apr 24 22:29:58.774777 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.774755 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6d2a4a7d-d624-412e-a43d-016ca5e90208-klusterlet-config\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"
Apr 24 22:29:58.775157 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.775131 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-ca\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" Apr 24 22:29:58.775989 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.775967 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" Apr 24 22:29:58.782059 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.782032 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4b92403-6523-4474-832b-2bb3cb7d7b9d-tmp-dir\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:29:58.782433 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.782372 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhj6x\" (UniqueName: \"kubernetes.io/projected/e4b92403-6523-4474-832b-2bb3cb7d7b9d-kube-api-access-zhj6x\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:29:58.784545 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.784519 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7t7l\" (UniqueName: \"kubernetes.io/projected/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-kube-api-access-t7t7l\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" Apr 24 22:29:58.784646 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.784604 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7d55577796-jwfxd\" (UID: \"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" Apr 24 22:29:58.784711 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.784678 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5x8\" (UniqueName: \"kubernetes.io/projected/3a180131-c839-45eb-9da2-6f9ffa71d641-kube-api-access-mr5x8\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:29:58.784858 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.784838 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfz7d\" (UniqueName: \"kubernetes.io/projected/9b249279-0358-4efe-bcbd-16ecb96ece58-kube-api-access-zfz7d\") pod \"network-check-source-8894fc9bd-nncrg\" (UID: \"9b249279-0358-4efe-bcbd-16ecb96ece58\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg" Apr 24 22:29:58.785472 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.785430 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv2x\" (UniqueName: \"kubernetes.io/projected/6d2a4a7d-d624-412e-a43d-016ca5e90208-kube-api-access-dwv2x\") pod \"klusterlet-addon-workmgr-54c84c649f-5bc4c\" (UID: \"6d2a4a7d-d624-412e-a43d-016ca5e90208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" Apr 24 22:29:58.785554 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.785472 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps7g4\" (UniqueName: \"kubernetes.io/projected/5238b830-7ee8-4057-83b0-9eb79541d31e-kube-api-access-ps7g4\") pod \"managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb\" (UID: \"5238b830-7ee8-4057-83b0-9eb79541d31e\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb" Apr 24 22:29:58.788077 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.788058 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" Apr 24 22:29:58.790313 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.790294 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkfh\" (UniqueName: \"kubernetes.io/projected/ba1f6487-5042-4d0a-8bb2-4f385d224529-kube-api-access-snkfh\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:29:58.848921 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.848888 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg" Apr 24 22:29:58.860766 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.860704 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" Apr 24 22:29:58.872667 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.872642 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f63adb0-5994-496d-880d-9d660e539622-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:29:58.872779 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.872686 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:58.872856 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.872786 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:29:58.872987 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.872970 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:29:58.873057 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:58.873042 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:59.373023197 +0000 UTC m=+34.883550379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:29:58.873384 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.873367 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f63adb0-5994-496d-880d-9d660e539622-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:29:58.875676 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.875657 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd2h4\" (UniqueName: \"kubernetes.io/projected/df25b403-ced4-4c31-9691-1da44a52f2a0-kube-api-access-fd2h4\") pod \"network-check-target-k24sc\" (UID: \"df25b403-ced4-4c31-9691-1da44a52f2a0\") " pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:29:58.920147 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.920110 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" Apr 24 22:29:58.927938 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:58.927916 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb" Apr 24 22:29:59.074009 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.073969 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:59.074189 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.074020 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:29:59.074189 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.074056 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:29:59.074189 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.074135 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:29:59.074340 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.074190 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:29:59.074340 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.074206 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-86b6cf9d64-bbh5q: secret "image-registry-tls" not found Apr 24 22:29:59.074340 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.074214 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.074181909 +0000 UTC m=+35.584709102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : secret "router-metrics-certs-default" not found Apr 24 22:29:59.074340 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.074235 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls podName:0d1f9a0b-38db-4713-969a-f7221408b685 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.07422344 +0000 UTC m=+35.584750622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls") pod "image-registry-86b6cf9d64-bbh5q" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685") : secret "image-registry-tls" not found Apr 24 22:29:59.074340 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.074249 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.074241418 +0000 UTC m=+35.584768596 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : configmap references non-existent config key: service-ca.crt Apr 24 22:29:59.100146 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.100116 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:29:59.103757 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.103517 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:29:59.103757 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.103530 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-74npb\"" Apr 24 22:29:59.175465 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.175383 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:29:59.175620 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.175557 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:29:59.175693 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.175681 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls podName:5d82ca42-78a8-4968-9083-5d9f43035324 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:00.175658506 +0000 UTC m=+35.686185707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lwc4g" (UID: "5d82ca42-78a8-4968-9083-5d9f43035324") : secret "samples-operator-tls" not found Apr 24 22:29:59.276395 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.276353 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:29:59.276558 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.276479 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:29:59.276558 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.276502 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:29:59.276558 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.276550 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:29:59.276692 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.276576 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert podName:3a180131-c839-45eb-9da2-6f9ffa71d641 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.276555071 +0000 UTC m=+35.787082270 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert") pod "ingress-canary-4mm5b" (UID: "3a180131-c839-45eb-9da2-6f9ffa71d641") : secret "canary-serving-cert" not found Apr 24 22:29:59.276692 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.276624 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:29:59.276692 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.276678 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls podName:e4b92403-6523-4474-832b-2bb3cb7d7b9d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.276664302 +0000 UTC m=+35.787191483 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls") pod "dns-default-6w7zw" (UID: "e4b92403-6523-4474-832b-2bb3cb7d7b9d") : secret "dns-default-metrics-tls" not found Apr 24 22:29:59.276692 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.276678 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:29:59.276905 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.276729 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.2767167 +0000 UTC m=+35.787243880 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:29:59.378016 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:29:59.377969 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:29:59.378233 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.378151 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:29:59.378298 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:29:59.378235 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.378214553 +0000 UTC m=+35.888741748 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.087237 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.087540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.087620 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.087719 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.087797 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.087774748 +0000 UTC m=+37.598301941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : secret "router-metrics-certs-default" not found Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.087845 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.087834333 +0000 UTC m=+37.598361525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.087877 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.087895 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86b6cf9d64-bbh5q: secret "image-registry-tls" not found Apr 24 22:30:00.087986 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.087944 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls podName:0d1f9a0b-38db-4713-969a-f7221408b685 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:02.087923262 +0000 UTC m=+37.598450460 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls") pod "image-registry-86b6cf9d64-bbh5q" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685") : secret "image-registry-tls" not found Apr 24 22:30:00.099775 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.099727 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:30:00.101320 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.100904 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:30:00.105245 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.104792 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:30:00.108036 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.107172 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-499pd\"" Apr 24 22:30:00.140926 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.140867 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c"] Apr 24 22:30:00.141581 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.141539 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt"] Apr 24 22:30:00.151858 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.151467 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 22:30:00.173996 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.173970 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw"] Apr 24 22:30:00.176999 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.176975 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb"] Apr 24 22:30:00.178410 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:00.178380 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde154e2a_4cec_4799_b4c9_72ed4e2d85c4.slice/crio-f4322485cc37b6ffd37e2cde11830abd614b8741266d8ed7c3c031bdf3ab1cdc WatchSource:0}: Error finding container f4322485cc37b6ffd37e2cde11830abd614b8741266d8ed7c3c031bdf3ab1cdc: Status 404 returned error can't find the container with id f4322485cc37b6ffd37e2cde11830abd614b8741266d8ed7c3c031bdf3ab1cdc Apr 24 22:30:00.180295 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.180270 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6"] Apr 24 22:30:00.180541 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:00.180517 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5238b830_7ee8_4057_83b0_9eb79541d31e.slice/crio-062be11aff08badd37d3be0cee42b9520f8a32d7ab1bd924ff8cf79af4035d82 WatchSource:0}: Error finding container 062be11aff08badd37d3be0cee42b9520f8a32d7ab1bd924ff8cf79af4035d82: Status 404 returned error can't find the container with id 062be11aff08badd37d3be0cee42b9520f8a32d7ab1bd924ff8cf79af4035d82 Apr 24 22:30:00.184400 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:00.184380 2582 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb95e6492_0f90_4364_8816_060a9df92b34.slice/crio-3e92a4c0b6d4ac7fbe3eb86fb3a1eb708774867dabc70f2ddbbf15e31bc074f1 WatchSource:0}: Error finding container 3e92a4c0b6d4ac7fbe3eb86fb3a1eb708774867dabc70f2ddbbf15e31bc074f1: Status 404 returned error can't find the container with id 3e92a4c0b6d4ac7fbe3eb86fb3a1eb708774867dabc70f2ddbbf15e31bc074f1 Apr 24 22:30:00.188300 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.188276 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:00.188450 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.188432 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:00.188516 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.188504 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls podName:5d82ca42-78a8-4968-9083-5d9f43035324 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.188482865 +0000 UTC m=+37.699010048 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lwc4g" (UID: "5d82ca42-78a8-4968-9083-5d9f43035324") : secret "samples-operator-tls" not found Apr 24 22:30:00.196986 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.194501 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd"] Apr 24 22:30:00.199437 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.198584 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-p6xqh"] Apr 24 22:30:00.200260 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:00.199962 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dde0bf2_7411_47ba_a4fd_5ac9e3c8eeeb.slice/crio-1014d2527dd85dddf5181b0efba035de4f74e08fc524b3b66b86963f490a5c36 WatchSource:0}: Error finding container 1014d2527dd85dddf5181b0efba035de4f74e08fc524b3b66b86963f490a5c36: Status 404 returned error can't find the container with id 1014d2527dd85dddf5181b0efba035de4f74e08fc524b3b66b86963f490a5c36 Apr 24 22:30:00.200260 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.199998 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tbw4p"] Apr 24 22:30:00.200595 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.200552 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg"] Apr 24 22:30:00.202587 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:00.202558 2582 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3517afbe_450e_4a99_a668_a2cc8ca01cbc.slice/crio-ee0eec904188e3ff60b8c67c720b3186e828f79990326733a241bcdd88bb8def WatchSource:0}: Error finding container ee0eec904188e3ff60b8c67c720b3186e828f79990326733a241bcdd88bb8def: Status 404 returned error can't find the container with id ee0eec904188e3ff60b8c67c720b3186e828f79990326733a241bcdd88bb8def Apr 24 22:30:00.246398 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.246359 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" event={"ID":"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb","Type":"ContainerStarted","Data":"1014d2527dd85dddf5181b0efba035de4f74e08fc524b3b66b86963f490a5c36"} Apr 24 22:30:00.249129 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.249069 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-p6xqh" event={"ID":"23d7baa2-f8ae-4472-abcb-860f16acd197","Type":"ContainerStarted","Data":"b0ea317e97f5fd160c01bbea7f7e02d162a09cfd318c83bd6228f25e8f7edf28"} Apr 24 22:30:00.250442 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.250407 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" event={"ID":"6d2a4a7d-d624-412e-a43d-016ca5e90208","Type":"ContainerStarted","Data":"7e216a3d9ce3914aafdd12b4f200cb8ebba686798e690139c757e10437f3da92"} Apr 24 22:30:00.251789 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.251763 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" event={"ID":"3517afbe-450e-4a99-a668-a2cc8ca01cbc","Type":"ContainerStarted","Data":"ee0eec904188e3ff60b8c67c720b3186e828f79990326733a241bcdd88bb8def"} Apr 24 22:30:00.253170 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.253117 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6" event={"ID":"b95e6492-0f90-4364-8816-060a9df92b34","Type":"ContainerStarted","Data":"3e92a4c0b6d4ac7fbe3eb86fb3a1eb708774867dabc70f2ddbbf15e31bc074f1"} Apr 24 22:30:00.254451 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.254423 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg" event={"ID":"9b249279-0358-4efe-bcbd-16ecb96ece58","Type":"ContainerStarted","Data":"885c653419c828f3d2946e7b3bd6320e2f3418343b2486d497651c8f24f7267e"} Apr 24 22:30:00.255551 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.255517 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" event={"ID":"9804071e-5980-4f1d-95ce-ff7b5002d9d9","Type":"ContainerStarted","Data":"b564d184792a2fb3fa991ab05332de238f6ab50367a209fbb0f096c516d61376"} Apr 24 22:30:00.257875 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.257849 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw" event={"ID":"de154e2a-4cec-4799-b4c9-72ed4e2d85c4","Type":"ContainerStarted","Data":"f4322485cc37b6ffd37e2cde11830abd614b8741266d8ed7c3c031bdf3ab1cdc"} Apr 24 22:30:00.259077 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.259053 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb" event={"ID":"5238b830-7ee8-4057-83b0-9eb79541d31e","Type":"ContainerStarted","Data":"062be11aff08badd37d3be0cee42b9520f8a32d7ab1bd924ff8cf79af4035d82"} Apr 24 22:30:00.289033 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.288993 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:00.289147 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.289063 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:00.289215 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.289180 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:00.289215 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.289191 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:00.289306 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.289268 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert podName:3a180131-c839-45eb-9da2-6f9ffa71d641 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.289216752 +0000 UTC m=+37.799743931 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert") pod "ingress-canary-4mm5b" (UID: "3a180131-c839-45eb-9da2-6f9ffa71d641") : secret "canary-serving-cert" not found Apr 24 22:30:00.289306 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.289285 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls podName:e4b92403-6523-4474-832b-2bb3cb7d7b9d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.289278406 +0000 UTC m=+37.799805592 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls") pod "dns-default-6w7zw" (UID: "e4b92403-6523-4474-832b-2bb3cb7d7b9d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:00.289401 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.289323 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:30:00.289449 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.289431 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:00.289501 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.289479 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.289461477 +0000 UTC m=+37.799988670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:00.308379 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.308344 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k24sc"] Apr 24 22:30:00.312048 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:00.312027 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf25b403_ced4_4c31_9691_1da44a52f2a0.slice/crio-29e0c754896612d5273be90395540aea3cf35c6a0d129f55095b8b5a3c493c6e WatchSource:0}: Error finding container 29e0c754896612d5273be90395540aea3cf35c6a0d129f55095b8b5a3c493c6e: Status 404 returned error can't find the container with id 29e0c754896612d5273be90395540aea3cf35c6a0d129f55095b8b5a3c493c6e Apr 24 22:30:00.390103 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:00.390009 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:30:00.390243 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.390154 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:00.390243 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:00.390222 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.390206511 +0000 UTC m=+37.900733689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:30:01.273966 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:01.273925 2582 generic.go:358] "Generic (PLEG): container finished" podID="acca45df-62e2-4002-8d37-055685b49029" containerID="3c8a8867c1b6c32b28db192d92e84d352c2aff91a27dafd364335743feadfda5" exitCode=0 Apr 24 22:30:01.274435 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:01.274045 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerDied","Data":"3c8a8867c1b6c32b28db192d92e84d352c2aff91a27dafd364335743feadfda5"} Apr 24 22:30:01.284047 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:01.283953 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k24sc" event={"ID":"df25b403-ced4-4c31-9691-1da44a52f2a0","Type":"ContainerStarted","Data":"29e0c754896612d5273be90395540aea3cf35c6a0d129f55095b8b5a3c493c6e"} Apr 24 22:30:02.113877 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.113254 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:02.113877 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.113366 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.11334217 +0000 UTC m=+41.623869361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : secret "router-metrics-certs-default" not found Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.115822 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.115933 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.116014 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.116344 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.116319024 +0000 UTC m=+41.626846207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.116438 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.116450 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86b6cf9d64-bbh5q: secret "image-registry-tls" not found Apr 24 22:30:02.118149 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.116493 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls podName:0d1f9a0b-38db-4713-969a-f7221408b685 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.116480268 +0000 UTC m=+41.627007450 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls") pod "image-registry-86b6cf9d64-bbh5q" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685") : secret "image-registry-tls" not found Apr 24 22:30:02.218161 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.217483 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:02.218161 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.217671 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:02.218161 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.217756 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls podName:5d82ca42-78a8-4968-9083-5d9f43035324 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.21773437 +0000 UTC m=+41.728261553 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lwc4g" (UID: "5d82ca42-78a8-4968-9083-5d9f43035324") : secret "samples-operator-tls" not found Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.318850 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.318986 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.319040 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.319238 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.319306 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert podName:3a180131-c839-45eb-9da2-6f9ffa71d641 
nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.319287922 +0000 UTC m=+41.829815102 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert") pod "ingress-canary-4mm5b" (UID: "3a180131-c839-45eb-9da2-6f9ffa71d641") : secret "canary-serving-cert" not found Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.319705 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.319756 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.319741059 +0000 UTC m=+41.830268250 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.319834 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:02.319901 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.319866 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls podName:e4b92403-6523-4474-832b-2bb3cb7d7b9d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.319855374 +0000 UTC m=+41.830382555 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls") pod "dns-default-6w7zw" (UID: "e4b92403-6523-4474-832b-2bb3cb7d7b9d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:02.349082 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.347997 2582 generic.go:358] "Generic (PLEG): container finished" podID="acca45df-62e2-4002-8d37-055685b49029" containerID="7245d2665f7986726cc30c5f042053abfc63d7f3fa60943a4010c01a9f92a70d" exitCode=0 Apr 24 22:30:02.349082 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.348092 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerDied","Data":"7245d2665f7986726cc30c5f042053abfc63d7f3fa60943a4010c01a9f92a70d"} Apr 24 22:30:02.421033 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:02.420237 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:30:02.421033 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.420388 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:02.421033 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:02.420454 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.420434068 +0000 UTC m=+41.930961275 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:30:06.158417 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.158379 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.158548 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.158578 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.158554262 +0000 UTC m=+49.669081452 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.158644 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.158645 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.158696 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.158680833 +0000 UTC m=+49.669208011 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : secret "router-metrics-certs-default" not found Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.158774 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.158788 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86b6cf9d64-bbh5q: secret "image-registry-tls" not found Apr 24 22:30:06.158947 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.158878 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls podName:0d1f9a0b-38db-4713-969a-f7221408b685 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.158864537 +0000 UTC m=+49.669391720 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls") pod "image-registry-86b6cf9d64-bbh5q" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685") : secret "image-registry-tls" not found Apr 24 22:30:06.259173 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.259087 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:06.259347 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.259234 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:06.259347 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.259312 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls podName:5d82ca42-78a8-4968-9083-5d9f43035324 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.259290722 +0000 UTC m=+49.769817913 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lwc4g" (UID: "5d82ca42-78a8-4968-9083-5d9f43035324") : secret "samples-operator-tls" not found Apr 24 22:30:06.359832 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.359770 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:06.359996 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.359864 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:06.359996 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.359923 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:06.359996 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.359947 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:30:06.359996 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.359983 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:06.359996 ip-10-0-133-73 
kubenswrapper[2582]: E0424 22:30:06.359995 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls podName:e4b92403-6523-4474-832b-2bb3cb7d7b9d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.359974654 +0000 UTC m=+49.870501851 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls") pod "dns-default-6w7zw" (UID: "e4b92403-6523-4474-832b-2bb3cb7d7b9d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:06.360267 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.360017 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert podName:3a180131-c839-45eb-9da2-6f9ffa71d641 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.360007017 +0000 UTC m=+49.870534196 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert") pod "ingress-canary-4mm5b" (UID: "3a180131-c839-45eb-9da2-6f9ffa71d641") : secret "canary-serving-cert" not found Apr 24 22:30:06.360267 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.360045 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:06.360267 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.360110 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.360093398 +0000 UTC m=+49.870620593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:06.460592 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:06.460540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:30:06.460736 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.460682 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:06.460772 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:06.460746 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.460731739 +0000 UTC m=+49.971258921 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:30:12.720158 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:12.720133 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:30:12.724195 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:12.724175 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae28b2f3-d733-438a-8a82-1ea82ac5ac63-original-pull-secret\") pod \"global-pull-secret-syncer-k6bhr\" (UID: \"ae28b2f3-d733-438a-8a82-1ea82ac5ac63\") " pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:30:12.744208 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:12.744183 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k6bhr" Apr 24 22:30:12.925279 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:12.925199 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k6bhr"] Apr 24 22:30:12.936594 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:12.936555 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae28b2f3_d733_438a_8a82_1ea82ac5ac63.slice/crio-a948d9625e8eb71373c54fb7168870a5450d9dc1bc148d70465189dfe0387688 WatchSource:0}: Error finding container a948d9625e8eb71373c54fb7168870a5450d9dc1bc148d70465189dfe0387688: Status 404 returned error can't find the container with id a948d9625e8eb71373c54fb7168870a5450d9dc1bc148d70465189dfe0387688 Apr 24 22:30:13.383256 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.383199 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6" event={"ID":"b95e6492-0f90-4364-8816-060a9df92b34","Type":"ContainerStarted","Data":"0a49154a6ec23d354a469a5876694b2cb5d51cb94d7c5ac8e67817621ca76b7e"} Apr 24 22:30:13.385357 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.384868 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg" event={"ID":"9b249279-0358-4efe-bcbd-16ecb96ece58","Type":"ContainerStarted","Data":"a6b7b019ed06a263e161091bc174fc9a26dc54bb9e8b1f3f17a8c510e07e644b"} Apr 24 22:30:13.387226 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.387196 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" event={"ID":"9804071e-5980-4f1d-95ce-ff7b5002d9d9","Type":"ContainerStarted","Data":"cc9e1ae6e2e89d0c811a4322f82371ec931f891d1137c11f5e00179763964f99"} Apr 24 22:30:13.388948 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.388900 
2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw" event={"ID":"de154e2a-4cec-4799-b4c9-72ed4e2d85c4","Type":"ContainerStarted","Data":"4fffa5408c1a8371d219e5cb75330c0e8373f1517c27eab431aa846c354165ac"} Apr 24 22:30:13.390745 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.390722 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb" event={"ID":"5238b830-7ee8-4057-83b0-9eb79541d31e","Type":"ContainerStarted","Data":"82292b61a5280639cf9b5c06a312b2bd6a7d55a73f100a15462487d43994dd84"} Apr 24 22:30:13.392190 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.392155 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" event={"ID":"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb","Type":"ContainerStarted","Data":"985158d06ce866537d6b7747e1b8c40773bbdc54a39c26277bac74175ec686a9"} Apr 24 22:30:13.393733 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.393625 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-p6xqh" event={"ID":"23d7baa2-f8ae-4472-abcb-860f16acd197","Type":"ContainerStarted","Data":"804f49197559c4922c77561e581565c0e020822a82cf569d190954558e89321c"} Apr 24 22:30:13.395159 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.395136 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k24sc" event={"ID":"df25b403-ced4-4c31-9691-1da44a52f2a0","Type":"ContainerStarted","Data":"c7df1b22b128522e4cf9e3f878c66301b3c96845a7cb763ab5cb5a00cd30677a"} Apr 24 22:30:13.395390 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.395371 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-k24sc" Apr 24 
22:30:13.398453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.398432 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" event={"ID":"acca45df-62e2-4002-8d37-055685b49029","Type":"ContainerStarted","Data":"e5ef5dbf38dc6d827dfc18daeba4d483385fbe32e12481c666de42a53736c545"} Apr 24 22:30:13.399759 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.399733 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" event={"ID":"6d2a4a7d-d624-412e-a43d-016ca5e90208","Type":"ContainerStarted","Data":"96676eaaf8ea3be7b7edd272b5694d4ab7ea48ab0c40cba6933867d073b3d85e"} Apr 24 22:30:13.400360 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.400343 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" Apr 24 22:30:13.402082 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.402063 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/0.log" Apr 24 22:30:13.402206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.402097 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" Apr 24 22:30:13.402206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.402121 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" event={"ID":"3517afbe-450e-4a99-a668-a2cc8ca01cbc","Type":"ContainerDied","Data":"71c1ad680fff688f639936fb6e381ae2f0e2e95d196c5e3fb6f14f68a43a2b5d"} Apr 24 22:30:13.402206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.402095 2582 generic.go:358] "Generic (PLEG): container finished" podID="3517afbe-450e-4a99-a668-a2cc8ca01cbc" 
containerID="71c1ad680fff688f639936fb6e381ae2f0e2e95d196c5e3fb6f14f68a43a2b5d" exitCode=255 Apr 24 22:30:13.403199 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.402658 2582 scope.go:117] "RemoveContainer" containerID="71c1ad680fff688f639936fb6e381ae2f0e2e95d196c5e3fb6f14f68a43a2b5d" Apr 24 22:30:13.405627 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.405607 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k6bhr" event={"ID":"ae28b2f3-d733-438a-8a82-1ea82ac5ac63","Type":"ContainerStarted","Data":"a948d9625e8eb71373c54fb7168870a5450d9dc1bc148d70465189dfe0387688"} Apr 24 22:30:13.415006 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.414953 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dnmh6" podStartSLOduration=10.189788607 podStartE2EDuration="22.414936731s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.186440446 +0000 UTC m=+35.696967625" lastFinishedPulling="2026-04-24 22:30:12.411588541 +0000 UTC m=+47.922115749" observedRunningTime="2026-04-24 22:30:13.412629506 +0000 UTC m=+48.923156709" watchObservedRunningTime="2026-04-24 22:30:13.414936731 +0000 UTC m=+48.925463935" Apr 24 22:30:13.459759 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.459715 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-k24sc" podStartSLOduration=36.066286636 podStartE2EDuration="48.45970184s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.313936669 +0000 UTC m=+35.824463849" lastFinishedPulling="2026-04-24 22:30:12.707351677 +0000 UTC m=+48.217879053" observedRunningTime="2026-04-24 22:30:13.458999219 +0000 UTC m=+48.969526419" watchObservedRunningTime="2026-04-24 22:30:13.45970184 +0000 UTC m=+48.970229040" Apr 24 22:30:13.623332 ip-10-0-133-73 
kubenswrapper[2582]: I0424 22:30:13.623277 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw" podStartSLOduration=10.298598174 podStartE2EDuration="22.623257039s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.180449083 +0000 UTC m=+35.690976272" lastFinishedPulling="2026-04-24 22:30:12.505107959 +0000 UTC m=+48.015635137" observedRunningTime="2026-04-24 22:30:13.569750186 +0000 UTC m=+49.080277396" watchObservedRunningTime="2026-04-24 22:30:13.623257039 +0000 UTC m=+49.133784241" Apr 24 22:30:13.671927 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.671499 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-p6xqh" podStartSLOduration=10.366497522 podStartE2EDuration="22.671477462s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.200126681 +0000 UTC m=+35.710653859" lastFinishedPulling="2026-04-24 22:30:12.505106603 +0000 UTC m=+48.015633799" observedRunningTime="2026-04-24 22:30:13.621965343 +0000 UTC m=+49.132492542" watchObservedRunningTime="2026-04-24 22:30:13.671477462 +0000 UTC m=+49.182004662" Apr 24 22:30:13.766101 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.766040 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-54c84c649f-5bc4c" podStartSLOduration=28.396889281 podStartE2EDuration="40.765995222s" podCreationTimestamp="2026-04-24 22:29:33 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.152691652 +0000 UTC m=+35.663218831" lastFinishedPulling="2026-04-24 22:30:12.521797586 +0000 UTC m=+48.032324772" observedRunningTime="2026-04-24 22:30:13.673698006 +0000 UTC m=+49.184225220" watchObservedRunningTime="2026-04-24 22:30:13.765995222 +0000 UTC m=+49.276522423" Apr 
24 22:30:13.766568 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.766314 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n5rkg" podStartSLOduration=16.399830591 podStartE2EDuration="48.766305323s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:29:27.792421279 +0000 UTC m=+3.302948474" lastFinishedPulling="2026-04-24 22:30:00.158896014 +0000 UTC m=+35.669423206" observedRunningTime="2026-04-24 22:30:13.765481623 +0000 UTC m=+49.276008826" watchObservedRunningTime="2026-04-24 22:30:13.766305323 +0000 UTC m=+49.276832526" Apr 24 22:30:13.872842 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.872421 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" podStartSLOduration=10.520547595 podStartE2EDuration="22.872406991s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.153244231 +0000 UTC m=+35.663771414" lastFinishedPulling="2026-04-24 22:30:12.50510363 +0000 UTC m=+48.015630810" observedRunningTime="2026-04-24 22:30:13.869030429 +0000 UTC m=+49.379557631" watchObservedRunningTime="2026-04-24 22:30:13.872406991 +0000 UTC m=+49.382934190" Apr 24 22:30:13.873243 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:13.873199 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ddc9bcc99-6lfqb" podStartSLOduration=28.340320745 podStartE2EDuration="40.873189358s" podCreationTimestamp="2026-04-24 22:29:33 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.182640821 +0000 UTC m=+35.693168013" lastFinishedPulling="2026-04-24 22:30:12.715509448 +0000 UTC m=+48.226036626" observedRunningTime="2026-04-24 22:30:13.831959108 +0000 UTC m=+49.342486319" watchObservedRunningTime="2026-04-24 22:30:13.873189358 +0000 UTC m=+49.383716558" 
Apr 24 22:30:14.237746 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.237702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:14.237959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.237765 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:14.237959 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.237837 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:14.237959 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.237882 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:14.237959 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.237945 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.237927531 +0000 UTC m=+65.748454726 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : secret "router-metrics-certs-default" not found Apr 24 22:30:14.238181 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.237992 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle podName:a46814a2-9573-4978-a715-70fdad9204e4 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.237982112 +0000 UTC m=+65.748509290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle") pod "router-default-74654b95d8-zp62l" (UID: "a46814a2-9573-4978-a715-70fdad9204e4") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:14.238181 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.238101 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:14.238181 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.238113 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86b6cf9d64-bbh5q: secret "image-registry-tls" not found Apr 24 22:30:14.238181 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.238143 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls podName:0d1f9a0b-38db-4713-969a-f7221408b685 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.238132212 +0000 UTC m=+65.748659393 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls") pod "image-registry-86b6cf9d64-bbh5q" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685") : secret "image-registry-tls" not found Apr 24 22:30:14.338564 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.338525 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:14.338867 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.338682 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:14.338867 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.338749 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls podName:5d82ca42-78a8-4968-9083-5d9f43035324 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.338732777 +0000 UTC m=+65.849259955 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lwc4g" (UID: "5d82ca42-78a8-4968-9083-5d9f43035324") : secret "samples-operator-tls" not found Apr 24 22:30:14.412355 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.412321 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:30:14.414582 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.412747 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/0.log" Apr 24 22:30:14.414582 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.412784 2582 generic.go:358] "Generic (PLEG): container finished" podID="3517afbe-450e-4a99-a668-a2cc8ca01cbc" containerID="fee5dd84ac009c87a0ec4f2c1d942d57a9026f28820ba977ec5fdc96a4df030a" exitCode=255 Apr 24 22:30:14.414582 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.413648 2582 scope.go:117] "RemoveContainer" containerID="fee5dd84ac009c87a0ec4f2c1d942d57a9026f28820ba977ec5fdc96a4df030a" Apr 24 22:30:14.414582 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.413851 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tbw4p_openshift-console-operator(3517afbe-450e-4a99-a668-a2cc8ca01cbc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" podUID="3517afbe-450e-4a99-a668-a2cc8ca01cbc" Apr 24 22:30:14.414582 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.413893 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" 
event={"ID":"3517afbe-450e-4a99-a668-a2cc8ca01cbc","Type":"ContainerDied","Data":"fee5dd84ac009c87a0ec4f2c1d942d57a9026f28820ba977ec5fdc96a4df030a"} Apr 24 22:30:14.414582 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.413923 2582 scope.go:117] "RemoveContainer" containerID="71c1ad680fff688f639936fb6e381ae2f0e2e95d196c5e3fb6f14f68a43a2b5d" Apr 24 22:30:14.439445 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.439415 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:30:14.439919 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.439903 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.440116 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.440181 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.440160994 +0000 UTC m=+65.950688179 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.440199 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.440244 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert podName:3a180131-c839-45eb-9da2-6f9ffa71d641 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.440228749 +0000 UTC m=+65.950755936 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert") pod "ingress-canary-4mm5b" (UID: "3a180131-c839-45eb-9da2-6f9ffa71d641") : secret "canary-serving-cert" not found Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.440122 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.441395 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:14.441488 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.441465 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls 
podName:e4b92403-6523-4474-832b-2bb3cb7d7b9d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.441450634 +0000 UTC m=+65.951977815 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls") pod "dns-default-6w7zw" (UID: "e4b92403-6523-4474-832b-2bb3cb7d7b9d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:14.491236 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.488539 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nncrg" podStartSLOduration=11.003496395 podStartE2EDuration="23.488522065s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.206262357 +0000 UTC m=+35.716789534" lastFinishedPulling="2026-04-24 22:30:12.691288007 +0000 UTC m=+48.201815204" observedRunningTime="2026-04-24 22:30:13.90223125 +0000 UTC m=+49.412758451" watchObservedRunningTime="2026-04-24 22:30:14.488522065 +0000 UTC m=+49.999049266" Apr 24 22:30:14.547707 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:14.547658 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:30:14.547926 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.547853 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:14.547998 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:14.547937 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.54791589 +0000 UTC m=+66.058443083 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:30:15.220492 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.220446 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mpcwd"] Apr 24 22:30:15.255868 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.255843 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mpcwd"] Apr 24 22:30:15.256023 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.255987 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.260262 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.260242 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 22:30:15.261065 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.261043 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 22:30:15.261180 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.261161 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qbjm6\"" Apr 24 22:30:15.355354 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.355319 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fdbacdd2-1641-4b55-904b-22b681419f52-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.355562 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.355411 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fdbacdd2-1641-4b55-904b-22b681419f52-data-volume\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.355562 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.355476 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fdbacdd2-1641-4b55-904b-22b681419f52-crio-socket\") pod \"insights-runtime-extractor-mpcwd\" (UID: 
\"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.355562 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.355503 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62h2x\" (UniqueName: \"kubernetes.io/projected/fdbacdd2-1641-4b55-904b-22b681419f52-kube-api-access-62h2x\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.355562 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.355534 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.419363 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.419334 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:30:15.419825 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.419791 2582 scope.go:117] "RemoveContainer" containerID="fee5dd84ac009c87a0ec4f2c1d942d57a9026f28820ba977ec5fdc96a4df030a" Apr 24 22:30:15.420058 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:15.420038 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tbw4p_openshift-console-operator(3517afbe-450e-4a99-a668-a2cc8ca01cbc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" podUID="3517afbe-450e-4a99-a668-a2cc8ca01cbc" Apr 24 
22:30:15.456786 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.456747 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fdbacdd2-1641-4b55-904b-22b681419f52-data-volume\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.457078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.456887 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fdbacdd2-1641-4b55-904b-22b681419f52-crio-socket\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.457078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.456915 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62h2x\" (UniqueName: \"kubernetes.io/projected/fdbacdd2-1641-4b55-904b-22b681419f52-kube-api-access-62h2x\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.457078 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.456958 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.457256 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.457118 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/fdbacdd2-1641-4b55-904b-22b681419f52-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.457366 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.457342 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fdbacdd2-1641-4b55-904b-22b681419f52-crio-socket\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.458006 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.457595 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fdbacdd2-1641-4b55-904b-22b681419f52-data-volume\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.458006 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:15.457625 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:15.458006 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:15.457695 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls podName:fdbacdd2-1641-4b55-904b-22b681419f52 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.957677162 +0000 UTC m=+51.468204340 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls") pod "insights-runtime-extractor-mpcwd" (UID: "fdbacdd2-1641-4b55-904b-22b681419f52") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:15.458355 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.458309 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fdbacdd2-1641-4b55-904b-22b681419f52-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.477739 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.477665 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62h2x\" (UniqueName: \"kubernetes.io/projected/fdbacdd2-1641-4b55-904b-22b681419f52-kube-api-access-62h2x\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.962707 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:15.962666 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:15.962898 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:15.962829 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:15.962945 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:15.962900 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls podName:fdbacdd2-1641-4b55-904b-22b681419f52 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:16.962885102 +0000 UTC m=+52.473412280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls") pod "insights-runtime-extractor-mpcwd" (UID: "fdbacdd2-1641-4b55-904b-22b681419f52") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:16.021772 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:16.021743 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fq7vn_e6d48c3e-9be8-4750-ab9c-18ee060b61dd/dns-node-resolver/0.log" Apr 24 22:30:16.973863 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:16.973816 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:16.974261 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:16.973926 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:16.974261 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:16.974016 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls podName:fdbacdd2-1641-4b55-904b-22b681419f52 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.974001171 +0000 UTC m=+54.484528472 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls") pod "insights-runtime-extractor-mpcwd" (UID: "fdbacdd2-1641-4b55-904b-22b681419f52") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:17.226694 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:17.226608 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sw5hp_fb372715-ce4d-476e-881a-eedf339ac388/node-ca/0.log" Apr 24 22:30:18.429559 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:18.429100 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k6bhr" event={"ID":"ae28b2f3-d733-438a-8a82-1ea82ac5ac63","Type":"ContainerStarted","Data":"88b76cc3c8b8a34114ac6a764f68dfa20d7a05b1737002dad77b2783dabdcb4e"} Apr 24 22:30:18.472438 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:18.472386 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-k6bhr" podStartSLOduration=33.81663176 podStartE2EDuration="38.472371991s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:30:12.939284481 +0000 UTC m=+48.449811665" lastFinishedPulling="2026-04-24 22:30:17.595024709 +0000 UTC m=+53.105551896" observedRunningTime="2026-04-24 22:30:18.471483644 +0000 UTC m=+53.982010845" watchObservedRunningTime="2026-04-24 22:30:18.472371991 +0000 UTC m=+53.982899191" Apr 24 22:30:18.681157 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:18.681079 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 22:30:18.681157 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:18.681131 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" Apr 24 
22:30:18.681539 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:18.681524 2582 scope.go:117] "RemoveContainer" containerID="fee5dd84ac009c87a0ec4f2c1d942d57a9026f28820ba977ec5fdc96a4df030a" Apr 24 22:30:18.681745 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:18.681723 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tbw4p_openshift-console-operator(3517afbe-450e-4a99-a668-a2cc8ca01cbc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" podUID="3517afbe-450e-4a99-a668-a2cc8ca01cbc" Apr 24 22:30:18.992132 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:18.992051 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:18.992288 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:18.992209 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:18.992288 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:18.992286 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls podName:fdbacdd2-1641-4b55-904b-22b681419f52 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:22.992265777 +0000 UTC m=+58.502792967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls") pod "insights-runtime-extractor-mpcwd" (UID: "fdbacdd2-1641-4b55-904b-22b681419f52") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:19.433355 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:19.433318 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" event={"ID":"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb","Type":"ContainerStarted","Data":"7bd5a49978b7878c084b39f2026e874effa466e034338c5671f1df5f42cc8481"} Apr 24 22:30:19.433355 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:19.433356 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" event={"ID":"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb","Type":"ContainerStarted","Data":"30787ba5ce0d54c2bc68c9e4a4f8fcaf6ab6492ceb07f614682a7b7ceed8c7ed"} Apr 24 22:30:19.466033 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:19.465983 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" podStartSLOduration=27.454078364 podStartE2EDuration="46.46596747s" podCreationTimestamp="2026-04-24 22:29:33 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.201348111 +0000 UTC m=+35.711875293" lastFinishedPulling="2026-04-24 22:30:19.213237218 +0000 UTC m=+54.723764399" observedRunningTime="2026-04-24 22:30:19.465347827 +0000 UTC m=+54.975875027" watchObservedRunningTime="2026-04-24 22:30:19.46596747 +0000 UTC m=+54.976494735" Apr 24 22:30:23.026901 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:23.026859 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd" Apr 24 22:30:23.027250 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:23.027011 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:23.027250 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:23.027083 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls podName:fdbacdd2-1641-4b55-904b-22b681419f52 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:31.027067113 +0000 UTC m=+66.537594293 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls") pod "insights-runtime-extractor-mpcwd" (UID: "fdbacdd2-1641-4b55-904b-22b681419f52") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:24.244815 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:24.244783 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ntzt" Apr 24 22:30:30.294743 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.294702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:30.294743 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.294750 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:30.295353 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.294786 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:30.295405 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.295367 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a46814a2-9573-4978-a715-70fdad9204e4-service-ca-bundle\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:30.297448 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.297425 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a46814a2-9573-4978-a715-70fdad9204e4-metrics-certs\") pod \"router-default-74654b95d8-zp62l\" (UID: \"a46814a2-9573-4978-a715-70fdad9204e4\") " pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:30.297553 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.297445 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"image-registry-86b6cf9d64-bbh5q\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:30.396151 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.396112 
2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:30.398643 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.398622 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d82ca42-78a8-4968-9083-5d9f43035324-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lwc4g\" (UID: \"5d82ca42-78a8-4968-9083-5d9f43035324\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:30.444552 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.444522 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lr488\"" Apr 24 22:30:30.451962 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.451941 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:30:30.496792 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.496727 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" Apr 24 22:30:30.497933 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.496851 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:30.497933 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.496914 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:30.497933 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:30.496947 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:30.497933 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:30.497039 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls podName:ba1f6487-5042-4d0a-8bb2-4f385d224529 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:02.49702083 +0000 UTC m=+98.007548015 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59n7s" (UID: "ba1f6487-5042-4d0a-8bb2-4f385d224529") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:30.500535 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.500506 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4b92403-6523-4474-832b-2bb3cb7d7b9d-metrics-tls\") pod \"dns-default-6w7zw\" (UID: \"e4b92403-6523-4474-832b-2bb3cb7d7b9d\") " pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:30.501827 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.501750 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2xb78\"" Apr 24 22:30:30.502051 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.502031 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a180131-c839-45eb-9da2-6f9ffa71d641-cert\") pod \"ingress-canary-4mm5b\" (UID: \"3a180131-c839-45eb-9da2-6f9ffa71d641\") " pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:30.509389 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.509358 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74654b95d8-zp62l" Apr 24 22:30:30.551424 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.551278 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-c2l8h\"" Apr 24 22:30:30.559124 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.559096 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" Apr 24 22:30:30.587796 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.587752 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86b6cf9d64-bbh5q"] Apr 24 22:30:30.589207 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:30.589178 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1f9a0b_38db_4713_969a_f7221408b685.slice/crio-6c71e7639ab265108ab1ce0d5387f72856b40745da3c42728f29692de52ee360 WatchSource:0}: Error finding container 6c71e7639ab265108ab1ce0d5387f72856b40745da3c42728f29692de52ee360: Status 404 returned error can't find the container with id 6c71e7639ab265108ab1ce0d5387f72856b40745da3c42728f29692de52ee360 Apr 24 22:30:30.598028 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.597867 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" Apr 24 22:30:30.598028 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:30.597992 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:30.598147 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:30:30.598063 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert podName:9f63adb0-5994-496d-880d-9d660e539622 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:02.598040699 +0000 UTC m=+98.108567890 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9b2dm" (UID: "9f63adb0-5994-496d-880d-9d660e539622") : secret "networking-console-plugin-cert" not found Apr 24 22:30:30.627619 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.627575 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qwslk\"" Apr 24 22:30:30.635519 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.635409 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6w7zw" Apr 24 22:30:30.639878 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.639579 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x6wvt\"" Apr 24 22:30:30.647887 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.647850 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mm5b" Apr 24 22:30:30.656627 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.656496 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74654b95d8-zp62l"] Apr 24 22:30:30.660457 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:30.660152 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda46814a2_9573_4978_a715_70fdad9204e4.slice/crio-783a098ccb690fdad805c72c29a5964a8922490ab84094a186496fe2ee6b54d6 WatchSource:0}: Error finding container 783a098ccb690fdad805c72c29a5964a8922490ab84094a186496fe2ee6b54d6: Status 404 returned error can't find the container with id 783a098ccb690fdad805c72c29a5964a8922490ab84094a186496fe2ee6b54d6 Apr 24 22:30:30.763331 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.763286 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g"] Apr 24 22:30:30.804792 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.804707 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:30:30.807654 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.807628 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:30.817306 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.817278 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6w7zw"] Apr 24 22:30:30.818538 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.818516 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c19cc309-d892-45ed-a3cd-43a98273bafb-metrics-certs\") pod \"network-metrics-daemon-44r7l\" (UID: \"c19cc309-d892-45ed-a3cd-43a98273bafb\") " pod="openshift-multus/network-metrics-daemon-44r7l" Apr 24 22:30:30.820313 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:30.820291 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b92403_6523_4474_832b_2bb3cb7d7b9d.slice/crio-99b595af53ca0842137f6b23fd0aef5af24a3c7b3c89d77e36b1593a1dabba4f WatchSource:0}: Error finding container 99b595af53ca0842137f6b23fd0aef5af24a3c7b3c89d77e36b1593a1dabba4f: Status 404 returned error can't find the container with id 99b595af53ca0842137f6b23fd0aef5af24a3c7b3c89d77e36b1593a1dabba4f Apr 24 22:30:30.838398 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.838371 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4mm5b"] Apr 24 22:30:30.840383 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:30.840356 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a180131_c839_45eb_9da2_6f9ffa71d641.slice/crio-5ef0023a3eb1f2814dbcc9a34c99e836410bb23bb7c4f6ca37f4f38152890c34 WatchSource:0}: Error finding container 5ef0023a3eb1f2814dbcc9a34c99e836410bb23bb7c4f6ca37f4f38152890c34: Status 404 returned error can't find the container with id 5ef0023a3eb1f2814dbcc9a34c99e836410bb23bb7c4f6ca37f4f38152890c34 Apr 24 22:30:30.914882 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.914849 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-74npb\"" Apr 24 22:30:30.922495 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:30.922466 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-44r7l"
Apr 24 22:30:31.063540 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.063516 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-44r7l"]
Apr 24 22:30:31.065327 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:31.065298 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19cc309_d892_45ed_a3cd_43a98273bafb.slice/crio-01010d11b80cb95177c25f6ebd276784c4a5cbd4f25057fa1b13abf765e21704 WatchSource:0}: Error finding container 01010d11b80cb95177c25f6ebd276784c4a5cbd4f25057fa1b13abf765e21704: Status 404 returned error can't find the container with id 01010d11b80cb95177c25f6ebd276784c4a5cbd4f25057fa1b13abf765e21704
Apr 24 22:30:31.107442 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.107408 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd"
Apr 24 22:30:31.109792 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.109766 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fdbacdd2-1641-4b55-904b-22b681419f52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mpcwd\" (UID: \"fdbacdd2-1641-4b55-904b-22b681419f52\") " pod="openshift-insights/insights-runtime-extractor-mpcwd"
Apr 24 22:30:31.170054 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.170023 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qbjm6\""
Apr 24 22:30:31.177821 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.177775 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mpcwd"
Apr 24 22:30:31.336694 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.336588 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mpcwd"]
Apr 24 22:30:31.341792 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:30:31.341752 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdbacdd2_1641_4b55_904b_22b681419f52.slice/crio-aa542914068d2fe73e91ec939138462ceb0a28a61eda43ba70907305d2977077 WatchSource:0}: Error finding container aa542914068d2fe73e91ec939138462ceb0a28a61eda43ba70907305d2977077: Status 404 returned error can't find the container with id aa542914068d2fe73e91ec939138462ceb0a28a61eda43ba70907305d2977077
Apr 24 22:30:31.489890 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.489840 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w7zw" event={"ID":"e4b92403-6523-4474-832b-2bb3cb7d7b9d","Type":"ContainerStarted","Data":"99b595af53ca0842137f6b23fd0aef5af24a3c7b3c89d77e36b1593a1dabba4f"}
Apr 24 22:30:31.494339 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.493349 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" event={"ID":"0d1f9a0b-38db-4713-969a-f7221408b685","Type":"ContainerStarted","Data":"d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5"}
Apr 24 22:30:31.494339 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.493389 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" event={"ID":"0d1f9a0b-38db-4713-969a-f7221408b685","Type":"ContainerStarted","Data":"6c71e7639ab265108ab1ce0d5387f72856b40745da3c42728f29692de52ee360"}
Apr 24 22:30:31.494339 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.494309 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:30:31.496279 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.496205 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mpcwd" event={"ID":"fdbacdd2-1641-4b55-904b-22b681419f52","Type":"ContainerStarted","Data":"3843c47b50664336abbd215aade29b76ddc1705172771a6864eb95d8a7838b0d"}
Apr 24 22:30:31.496279 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.496238 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mpcwd" event={"ID":"fdbacdd2-1641-4b55-904b-22b681419f52","Type":"ContainerStarted","Data":"aa542914068d2fe73e91ec939138462ceb0a28a61eda43ba70907305d2977077"}
Apr 24 22:30:31.498995 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.498938 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74654b95d8-zp62l" event={"ID":"a46814a2-9573-4978-a715-70fdad9204e4","Type":"ContainerStarted","Data":"47d9ebf59f19965fbb9aa1a4914f4d35d7931adb383378461c1135806d8f867d"}
Apr 24 22:30:31.498995 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.498965 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74654b95d8-zp62l" event={"ID":"a46814a2-9573-4978-a715-70fdad9204e4","Type":"ContainerStarted","Data":"783a098ccb690fdad805c72c29a5964a8922490ab84094a186496fe2ee6b54d6"}
Apr 24 22:30:31.500739 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.500685 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4mm5b" event={"ID":"3a180131-c839-45eb-9da2-6f9ffa71d641","Type":"ContainerStarted","Data":"5ef0023a3eb1f2814dbcc9a34c99e836410bb23bb7c4f6ca37f4f38152890c34"}
Apr 24 22:30:31.502331 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.502285 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" event={"ID":"5d82ca42-78a8-4968-9083-5d9f43035324","Type":"ContainerStarted","Data":"4ab38013cef13dce9654730e473ed8295fb14fdf6583879175c58b2c140b2d44"}
Apr 24 22:30:31.504235 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.504184 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-44r7l" event={"ID":"c19cc309-d892-45ed-a3cd-43a98273bafb","Type":"ContainerStarted","Data":"01010d11b80cb95177c25f6ebd276784c4a5cbd4f25057fa1b13abf765e21704"}
Apr 24 22:30:31.509893 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.509863 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:30:31.512855 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.512821 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:30:31.541208 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.540553 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" podStartSLOduration=66.540537573 podStartE2EDuration="1m6.540537573s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:31.539040137 +0000 UTC m=+67.049567375" watchObservedRunningTime="2026-04-24 22:30:31.540537573 +0000 UTC m=+67.051064770"
Apr 24 22:30:31.569899 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:31.568653 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-74654b95d8-zp62l" podStartSLOduration=40.568632308 podStartE2EDuration="40.568632308s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:31.568227916 +0000 UTC m=+67.078755116" watchObservedRunningTime="2026-04-24 22:30:31.568632308 +0000 UTC m=+67.079159512"
Apr 24 22:30:32.101069 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:32.100681 2582 scope.go:117] "RemoveContainer" containerID="fee5dd84ac009c87a0ec4f2c1d942d57a9026f28820ba977ec5fdc96a4df030a"
Apr 24 22:30:32.509113 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:32.509033 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:30:32.510346 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:32.510324 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-74654b95d8-zp62l"
Apr 24 22:30:34.520823 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.520765 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4mm5b" event={"ID":"3a180131-c839-45eb-9da2-6f9ffa71d641","Type":"ContainerStarted","Data":"85a50bcec78356cdfb2dc4a00dba3f5818b18c083a4486f127e8214cf9aac49d"}
Apr 24 22:30:34.522729 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.522695 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" event={"ID":"5d82ca42-78a8-4968-9083-5d9f43035324","Type":"ContainerStarted","Data":"dc42aa58980fbe0e0711929826105a667cb3a7a36e7b533613df9ba2c560bf9a"}
Apr 24 22:30:34.522892 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.522737 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" event={"ID":"5d82ca42-78a8-4968-9083-5d9f43035324","Type":"ContainerStarted","Data":"3bcb809d9284ca46f3a8df6eb8e62a8a9abb9508c254e06beab94c91d1cb47a5"}
Apr 24 22:30:34.524607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.524575 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-44r7l" event={"ID":"c19cc309-d892-45ed-a3cd-43a98273bafb","Type":"ContainerStarted","Data":"a85bed2a31c71f245b8b966736561f73e3655b7d0ff25faff11bf8636b5f0263"}
Apr 24 22:30:34.524726 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.524610 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-44r7l" event={"ID":"c19cc309-d892-45ed-a3cd-43a98273bafb","Type":"ContainerStarted","Data":"cd6abf5c6419f68a8a68b4d8157e121924ae64e5651f710285e558d0c272deed"}
Apr 24 22:30:34.526402 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.526381 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log"
Apr 24 22:30:34.526523 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.526468 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" event={"ID":"3517afbe-450e-4a99-a668-a2cc8ca01cbc","Type":"ContainerStarted","Data":"d1ece15eb9a1cb5b7b0238b79069a05efa3adb68b7c5e520554935f3eeef8ed3"}
Apr 24 22:30:34.526837 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.526778 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:30:34.528819 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.528784 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w7zw" event={"ID":"e4b92403-6523-4474-832b-2bb3cb7d7b9d","Type":"ContainerStarted","Data":"4093fc0dca35ee08055cdc323158d385515586a4b0737b84d37e4277377e68d6"}
Apr 24 22:30:34.528962 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.528944 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6w7zw" event={"ID":"e4b92403-6523-4474-832b-2bb3cb7d7b9d","Type":"ContainerStarted","Data":"1dc387a532aff9a4cd3f923271b995000cb8a4e381fc31b93be53d1227a3a3d6"}
Apr 24 22:30:34.529067 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.529054 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:30:34.530525 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.530503 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mpcwd" event={"ID":"fdbacdd2-1641-4b55-904b-22b681419f52","Type":"ContainerStarted","Data":"05b558789aabd4d997bb6650b7e436f87c30d24ab71b020d381a8d19097a4c77"}
Apr 24 22:30:34.543417 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.543375 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4mm5b" podStartSLOduration=33.726877838 podStartE2EDuration="36.543362681s" podCreationTimestamp="2026-04-24 22:29:58 +0000 UTC" firstStartedPulling="2026-04-24 22:30:30.842478174 +0000 UTC m=+66.353005367" lastFinishedPulling="2026-04-24 22:30:33.65896302 +0000 UTC m=+69.169490210" observedRunningTime="2026-04-24 22:30:34.542085048 +0000 UTC m=+70.052612245" watchObservedRunningTime="2026-04-24 22:30:34.543362681 +0000 UTC m=+70.053889882"
Apr 24 22:30:34.563125 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.563070 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-44r7l" podStartSLOduration=66.9713589 podStartE2EDuration="1m9.563051675s" podCreationTimestamp="2026-04-24 22:29:25 +0000 UTC" firstStartedPulling="2026-04-24 22:30:31.067291171 +0000 UTC m=+66.577818367" lastFinishedPulling="2026-04-24 22:30:33.658983946 +0000 UTC m=+69.169511142" observedRunningTime="2026-04-24 22:30:34.561040167 +0000 UTC m=+70.071567369" watchObservedRunningTime="2026-04-24 22:30:34.563051675 +0000 UTC m=+70.073578881"
Apr 24 22:30:34.583839 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.583780 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p" podStartSLOduration=31.086158647 podStartE2EDuration="43.583763435s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:00.207868417 +0000 UTC m=+35.718395610" lastFinishedPulling="2026-04-24 22:30:12.70547322 +0000 UTC m=+48.216000398" observedRunningTime="2026-04-24 22:30:34.58223518 +0000 UTC m=+70.092762390" watchObservedRunningTime="2026-04-24 22:30:34.583763435 +0000 UTC m=+70.094290634"
Apr 24 22:30:34.614957 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.614899 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6w7zw" podStartSLOduration=33.778856722 podStartE2EDuration="36.61488431s" podCreationTimestamp="2026-04-24 22:29:58 +0000 UTC" firstStartedPulling="2026-04-24 22:30:30.82255804 +0000 UTC m=+66.333085220" lastFinishedPulling="2026-04-24 22:30:33.658585625 +0000 UTC m=+69.169112808" observedRunningTime="2026-04-24 22:30:34.614026694 +0000 UTC m=+70.124553904" watchObservedRunningTime="2026-04-24 22:30:34.61488431 +0000 UTC m=+70.125411511"
Apr 24 22:30:34.645228 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.645177 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lwc4g" podStartSLOduration=40.810465647 podStartE2EDuration="43.645161805s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:30.823879009 +0000 UTC m=+66.334406187" lastFinishedPulling="2026-04-24 22:30:33.658575163 +0000 UTC m=+69.169102345" observedRunningTime="2026-04-24 22:30:34.644204232 +0000 UTC m=+70.154731433" watchObservedRunningTime="2026-04-24 22:30:34.645161805 +0000 UTC m=+70.155689004"
Apr 24 22:30:34.817266 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:34.817170 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-tbw4p"
Apr 24 22:30:35.535718 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:35.535628 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mpcwd" event={"ID":"fdbacdd2-1641-4b55-904b-22b681419f52","Type":"ContainerStarted","Data":"e560c9669e881d9501e0a6f5a52fbb1515a250fba40a2b95ab4ff6010ad49104"}
Apr 24 22:30:44.422601 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:44.422458 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-k24sc"
Apr 24 22:30:44.455641 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:44.455580 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mpcwd" podStartSLOduration=25.691212813 podStartE2EDuration="29.455561462s" podCreationTimestamp="2026-04-24 22:30:15 +0000 UTC" firstStartedPulling="2026-04-24 22:30:31.431404441 +0000 UTC m=+66.941931626" lastFinishedPulling="2026-04-24 22:30:35.195753087 +0000 UTC m=+70.706280275" observedRunningTime="2026-04-24 22:30:35.563988651 +0000 UTC m=+71.074515872" watchObservedRunningTime="2026-04-24 22:30:44.455561462 +0000 UTC m=+79.966088664"
Apr 24 22:30:44.538571 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:44.538542 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6w7zw"
Apr 24 22:30:50.456676 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:50.456642 2582 patch_prober.go:28] interesting pod/image-registry-86b6cf9d64-bbh5q container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service
unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 22:30:50.457112 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:50.456696 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" podUID="0d1f9a0b-38db-4713-969a-f7221408b685" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:30:53.515736 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:53.515707 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q"
Apr 24 22:30:58.970693 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:30:58.970657 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86b6cf9d64-bbh5q"]
Apr 24 22:31:02.585925 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.585852 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:31:02.588309 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.588284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba1f6487-5042-4d0a-8bb2-4f385d224529-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59n7s\" (UID: \"ba1f6487-5042-4d0a-8bb2-4f385d224529\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:31:02.686418 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.686380 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"
Apr 24 22:31:02.689356 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.689332 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9f63adb0-5994-496d-880d-9d660e539622-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9b2dm\" (UID: \"9f63adb0-5994-496d-880d-9d660e539622\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"
Apr 24 22:31:02.707387 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.707362 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-m58ng\""
Apr 24 22:31:02.715592 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.715570 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"
Apr 24 22:31:02.841587 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.841533 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s"]
Apr 24 22:31:02.845710 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:31:02.845681 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1f6487_5042_4d0a_8bb2_4f385d224529.slice/crio-0d8463eb1c2c0e8680f6acace04b199ff64d49e8d0c8aa1367f0dc59479932bf WatchSource:0}: Error finding container 0d8463eb1c2c0e8680f6acace04b199ff64d49e8d0c8aa1367f0dc59479932bf: Status 404 returned error can't find the container with id 0d8463eb1c2c0e8680f6acace04b199ff64d49e8d0c8aa1367f0dc59479932bf
Apr 24 22:31:02.847159 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.847139 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-kp2cb\""
Apr 24 22:31:02.854424 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.854404 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"
Apr 24 22:31:02.980760 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:02.980732 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm"]
Apr 24 22:31:02.983288 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:31:02.983259 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f63adb0_5994_496d_880d_9d660e539622.slice/crio-c186447a0f38b14150084a522267badab9690143ebe3318ce58569e729ed4eb1 WatchSource:0}: Error finding container c186447a0f38b14150084a522267badab9690143ebe3318ce58569e729ed4eb1: Status 404 returned error can't find the container with id c186447a0f38b14150084a522267badab9690143ebe3318ce58569e729ed4eb1
Apr 24 22:31:03.614911 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:03.614867 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" event={"ID":"9f63adb0-5994-496d-880d-9d660e539622","Type":"ContainerStarted","Data":"c186447a0f38b14150084a522267badab9690143ebe3318ce58569e729ed4eb1"}
Apr 24 22:31:03.616109 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:03.616080 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" event={"ID":"ba1f6487-5042-4d0a-8bb2-4f385d224529","Type":"ContainerStarted","Data":"0d8463eb1c2c0e8680f6acace04b199ff64d49e8d0c8aa1367f0dc59479932bf"}
Apr 24 22:31:04.623579 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:04.623538 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" event={"ID":"ba1f6487-5042-4d0a-8bb2-4f385d224529","Type":"ContainerStarted","Data":"451d0fd24b08771ae842a068176e816e03a896fc5365339aab94571c541b853d"}
Apr 24 22:31:04.624887 ip-10-0-133-73
kubenswrapper[2582]: I0424 22:31:04.624858 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" event={"ID":"9f63adb0-5994-496d-880d-9d660e539622","Type":"ContainerStarted","Data":"d74df1f345081f47714ab8a0c2b063105cd7db8461c50e216ce462632da063ca"}
Apr 24 22:31:04.646631 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:04.646387 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59n7s" podStartSLOduration=72.067860809 podStartE2EDuration="1m13.646369489s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:31:02.847918256 +0000 UTC m=+98.358445434" lastFinishedPulling="2026-04-24 22:31:04.426426924 +0000 UTC m=+99.936954114" observedRunningTime="2026-04-24 22:31:04.646342766 +0000 UTC m=+100.156869966" watchObservedRunningTime="2026-04-24 22:31:04.646369489 +0000 UTC m=+100.156896687"
Apr 24 22:31:04.672631 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:04.672522 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9b2dm" podStartSLOduration=66.235316419 podStartE2EDuration="1m7.672504606s" podCreationTimestamp="2026-04-24 22:29:57 +0000 UTC" firstStartedPulling="2026-04-24 22:31:02.985241981 +0000 UTC m=+98.495769160" lastFinishedPulling="2026-04-24 22:31:04.422430154 +0000 UTC m=+99.932957347" observedRunningTime="2026-04-24 22:31:04.672010638 +0000 UTC m=+100.182537839" watchObservedRunningTime="2026-04-24 22:31:04.672504606 +0000 UTC m=+100.183031808"
Apr 24 22:31:13.548146 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.548077 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tmp6m"]
Apr 24 22:31:13.551456 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.551439 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.554991 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.554971 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 22:31:13.555279 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.555258 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 22:31:13.555398 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.555257 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 22:31:13.555455 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.555421 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 22:31:13.556607 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.556590 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8plhd\""
Apr 24 22:31:13.678603 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678574 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678603 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678610 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-metrics-client-ca\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678794 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678644 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-sys\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678794 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678669 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74p72\" (UniqueName: \"kubernetes.io/projected/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-kube-api-access-74p72\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678794 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678713 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-textfile\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678794 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678783 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-tls\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678937 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678829 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-wtmp\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678937 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678848 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-root\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.678937 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.678871 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-accelerators-collector-config\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779481 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779448 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779653 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779490 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-metrics-client-ca\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779653 ip-10-0-133-73 kubenswrapper[2582]: I0424
22:31:13.779529 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-sys\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779653 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779587 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-sys\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779653 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779625 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74p72\" (UniqueName: \"kubernetes.io/projected/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-kube-api-access-74p72\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779653 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779650 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-textfile\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779936 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-tls\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779936 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779733 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-wtmp\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779936 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779757 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-root\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.779936 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779789 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-accelerators-collector-config\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.780206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779931 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-wtmp\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.780206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.779990 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-root\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.780206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.780023 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-textfile\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.780206 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.780183 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-metrics-client-ca\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.780522 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.780501 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-accelerators-collector-config\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.782142 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.782118 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-tls\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.782243 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.782166 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.787932 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.787905 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74p72\" (UniqueName: \"kubernetes.io/projected/d61e0701-390f-4cc5-a5ac-b8ab5bd08a88-kube-api-access-74p72\") pod \"node-exporter-tmp6m\" (UID: \"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88\") " pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.860998 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:13.860968 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tmp6m"
Apr 24 22:31:13.870159 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:31:13.870132 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61e0701_390f_4cc5_a5ac_b8ab5bd08a88.slice/crio-ea3e68994fbb8f22cfe5fe1ac1f318edb5099e9ffb3d610e4d980b30040cfe5a WatchSource:0}: Error finding container ea3e68994fbb8f22cfe5fe1ac1f318edb5099e9ffb3d610e4d980b30040cfe5a: Status 404 returned error can't find the container with id ea3e68994fbb8f22cfe5fe1ac1f318edb5099e9ffb3d610e4d980b30040cfe5a
Apr 24 22:31:14.652463 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:14.652433 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmp6m" event={"ID":"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88","Type":"ContainerStarted","Data":"ea3e68994fbb8f22cfe5fe1ac1f318edb5099e9ffb3d610e4d980b30040cfe5a"}
Apr 24 22:31:15.656941 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:15.656895 2582 generic.go:358] "Generic (PLEG): container finished" podID="d61e0701-390f-4cc5-a5ac-b8ab5bd08a88" containerID="507c20f0910f6a0b33b24e6144175f45269a7d2c1d739e55b49ec62c53b64860" exitCode=0
Apr 24 22:31:15.657395 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:15.656964 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-monitoring/node-exporter-tmp6m" event={"ID":"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88","Type":"ContainerDied","Data":"507c20f0910f6a0b33b24e6144175f45269a7d2c1d739e55b49ec62c53b64860"} Apr 24 22:31:16.661901 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:16.661856 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmp6m" event={"ID":"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88","Type":"ContainerStarted","Data":"ac3f04535cfaf1ebd3973956763be999466dc990e63c4a9fb416736cf4fe797d"} Apr 24 22:31:16.661901 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:16.661901 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tmp6m" event={"ID":"d61e0701-390f-4cc5-a5ac-b8ab5bd08a88","Type":"ContainerStarted","Data":"b6f377f94bcfac5c5397a576c627ee58aebe0cc61e64302cd0139bd013ae2a8f"} Apr 24 22:31:16.685879 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:16.685798 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tmp6m" podStartSLOduration=2.984603053 podStartE2EDuration="3.685778801s" podCreationTimestamp="2026-04-24 22:31:13 +0000 UTC" firstStartedPulling="2026-04-24 22:31:13.871709796 +0000 UTC m=+109.382236975" lastFinishedPulling="2026-04-24 22:31:14.572885532 +0000 UTC m=+110.083412723" observedRunningTime="2026-04-24 22:31:16.683495929 +0000 UTC m=+112.194023154" watchObservedRunningTime="2026-04-24 22:31:16.685778801 +0000 UTC m=+112.196306002" Apr 24 22:31:23.990507 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:23.990456 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" podUID="0d1f9a0b-38db-4713-969a-f7221408b685" containerName="registry" containerID="cri-o://d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5" gracePeriod=30 Apr 24 22:31:24.221624 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.221601 2582 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:31:24.371307 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371277 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-registry-certificates\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371472 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371332 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-installation-pull-secrets\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371472 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371352 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwj74\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-kube-api-access-fwj74\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371472 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371378 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371618 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371511 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d1f9a0b-38db-4713-969a-f7221408b685-ca-trust-extracted\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: 
\"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371618 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371553 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-bound-sa-token\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371618 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371596 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-trusted-ca\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371765 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371660 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-image-registry-private-configuration\") pod \"0d1f9a0b-38db-4713-969a-f7221408b685\" (UID: \"0d1f9a0b-38db-4713-969a-f7221408b685\") " Apr 24 22:31:24.371931 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.371892 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:24.372525 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.372490 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:24.374177 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.374148 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-kube-api-access-fwj74" (OuterVolumeSpecName: "kube-api-access-fwj74") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "kube-api-access-fwj74". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:24.374312 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.374279 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:24.374485 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.374467 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:24.374541 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.374478 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:24.374655 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.374637 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:24.379634 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.379601 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d1f9a0b-38db-4713-969a-f7221408b685-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0d1f9a0b-38db-4713-969a-f7221408b685" (UID: "0d1f9a0b-38db-4713-969a-f7221408b685"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:31:24.473201 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473163 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-bound-sa-token\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.473201 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473193 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-trusted-ca\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.473201 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473205 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-image-registry-private-configuration\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.473440 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473216 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d1f9a0b-38db-4713-969a-f7221408b685-registry-certificates\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.473440 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473225 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d1f9a0b-38db-4713-969a-f7221408b685-installation-pull-secrets\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.473440 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473235 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwj74\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-kube-api-access-fwj74\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 
24 22:31:24.473440 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473243 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d1f9a0b-38db-4713-969a-f7221408b685-registry-tls\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.473440 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.473252 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d1f9a0b-38db-4713-969a-f7221408b685-ca-trust-extracted\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:31:24.685541 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.685443 2582 generic.go:358] "Generic (PLEG): container finished" podID="0d1f9a0b-38db-4713-969a-f7221408b685" containerID="d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5" exitCode=0 Apr 24 22:31:24.685541 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.685507 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" Apr 24 22:31:24.685719 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.685534 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" event={"ID":"0d1f9a0b-38db-4713-969a-f7221408b685","Type":"ContainerDied","Data":"d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5"} Apr 24 22:31:24.685719 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.685580 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86b6cf9d64-bbh5q" event={"ID":"0d1f9a0b-38db-4713-969a-f7221408b685","Type":"ContainerDied","Data":"6c71e7639ab265108ab1ce0d5387f72856b40745da3c42728f29692de52ee360"} Apr 24 22:31:24.685719 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.685596 2582 scope.go:117] "RemoveContainer" containerID="d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5" Apr 24 22:31:24.694089 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.694066 2582 scope.go:117] "RemoveContainer" containerID="d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5" Apr 24 22:31:24.694324 ip-10-0-133-73 kubenswrapper[2582]: E0424 22:31:24.694306 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5\": container with ID starting with d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5 not found: ID does not exist" containerID="d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5" Apr 24 22:31:24.694388 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.694332 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5"} err="failed to get container status 
\"d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5\": rpc error: code = NotFound desc = could not find container \"d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5\": container with ID starting with d6e1b51a66b49f56e9669c4005d88217e0398549e18c55357f11c50400084bf5 not found: ID does not exist" Apr 24 22:31:24.705628 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.705604 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86b6cf9d64-bbh5q"] Apr 24 22:31:24.711242 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:24.711218 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-86b6cf9d64-bbh5q"] Apr 24 22:31:25.103198 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:25.103162 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d1f9a0b-38db-4713-969a-f7221408b685" path="/var/lib/kubelet/pods/0d1f9a0b-38db-4713-969a-f7221408b685/volumes" Apr 24 22:31:38.722441 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.722400 2582 generic.go:358] "Generic (PLEG): container finished" podID="9804071e-5980-4f1d-95ce-ff7b5002d9d9" containerID="cc9e1ae6e2e89d0c811a4322f82371ec931f891d1137c11f5e00179763964f99" exitCode=0 Apr 24 22:31:38.722924 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.722472 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" event={"ID":"9804071e-5980-4f1d-95ce-ff7b5002d9d9","Type":"ContainerDied","Data":"cc9e1ae6e2e89d0c811a4322f82371ec931f891d1137c11f5e00179763964f99"} Apr 24 22:31:38.722924 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.722849 2582 scope.go:117] "RemoveContainer" containerID="cc9e1ae6e2e89d0c811a4322f82371ec931f891d1137c11f5e00179763964f99" Apr 24 22:31:38.723931 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.723908 2582 generic.go:358] "Generic (PLEG): container finished" 
podID="de154e2a-4cec-4799-b4c9-72ed4e2d85c4" containerID="4fffa5408c1a8371d219e5cb75330c0e8373f1517c27eab431aa846c354165ac" exitCode=0 Apr 24 22:31:38.724026 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.723950 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw" event={"ID":"de154e2a-4cec-4799-b4c9-72ed4e2d85c4","Type":"ContainerDied","Data":"4fffa5408c1a8371d219e5cb75330c0e8373f1517c27eab431aa846c354165ac"} Apr 24 22:31:38.724196 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.724182 2582 scope.go:117] "RemoveContainer" containerID="4fffa5408c1a8371d219e5cb75330c0e8373f1517c27eab431aa846c354165ac" Apr 24 22:31:38.921929 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:38.921887 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" podUID="2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:31:39.728957 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:39.728913 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tpzdt" event={"ID":"9804071e-5980-4f1d-95ce-ff7b5002d9d9","Type":"ContainerStarted","Data":"a2bee9480f13cc2a091ddfde2685d633ee3b2428ebcb44a276a30a1960a08bbf"} Apr 24 22:31:39.730411 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:39.730383 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9c9zw" event={"ID":"de154e2a-4cec-4799-b4c9-72ed4e2d85c4","Type":"ContainerStarted","Data":"7e7a3e313c6b81b2c7cad8a934f4433601c0fac2558c23e37f9d70b7bf147b9a"} Apr 24 22:31:48.758057 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:48.758021 2582 generic.go:358] "Generic (PLEG): container finished" 
podID="23d7baa2-f8ae-4472-abcb-860f16acd197" containerID="804f49197559c4922c77561e581565c0e020822a82cf569d190954558e89321c" exitCode=0 Apr 24 22:31:48.758486 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:48.758082 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-p6xqh" event={"ID":"23d7baa2-f8ae-4472-abcb-860f16acd197","Type":"ContainerDied","Data":"804f49197559c4922c77561e581565c0e020822a82cf569d190954558e89321c"} Apr 24 22:31:48.758486 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:48.758447 2582 scope.go:117] "RemoveContainer" containerID="804f49197559c4922c77561e581565c0e020822a82cf569d190954558e89321c" Apr 24 22:31:48.922170 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:48.922128 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" podUID="2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:31:49.761949 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:49.761910 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-p6xqh" event={"ID":"23d7baa2-f8ae-4472-abcb-860f16acd197","Type":"ContainerStarted","Data":"a8f2ec12df2aff61c4872ced2ffb72736cc5616b2223baf361771439b39eef84"} Apr 24 22:31:58.921985 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:58.921940 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" podUID="2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:31:58.922426 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:58.922012 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" Apr 24 22:31:58.922520 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:58.922499 2582 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"7bd5a49978b7878c084b39f2026e874effa466e034338c5671f1df5f42cc8481"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 22:31:58.922556 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:58.922543 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" podUID="2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb" containerName="service-proxy" containerID="cri-o://7bd5a49978b7878c084b39f2026e874effa466e034338c5671f1df5f42cc8481" gracePeriod=30 Apr 24 22:31:59.792788 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:59.792753 2582 generic.go:358] "Generic (PLEG): container finished" podID="2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb" containerID="7bd5a49978b7878c084b39f2026e874effa466e034338c5671f1df5f42cc8481" exitCode=2 Apr 24 22:31:59.792978 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:59.792845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" event={"ID":"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb","Type":"ContainerDied","Data":"7bd5a49978b7878c084b39f2026e874effa466e034338c5671f1df5f42cc8481"} Apr 24 22:31:59.792978 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:31:59.792890 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7d55577796-jwfxd" event={"ID":"2dde0bf2-7411-47ba-a4fd-5ac9e3c8eeeb","Type":"ContainerStarted","Data":"25d30e2216a53cff54745c8d43cde5274fbf12a2361667109d1045bcb563f50c"} Apr 24 22:34:25.011663 
ip-10-0-133-73 kubenswrapper[2582]: I0424 22:34:25.011634 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:34:25.012258 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:34:25.011797 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:34:25.029282 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:34:25.029258 2582 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:39:25.040636 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:39:25.040604 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:39:25.042491 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:39:25.042457 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:44:25.061015 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:44:25.060922 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:44:25.062925 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:44:25.062903 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:49:25.079253 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:49:25.079215 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 
22:49:25.084701 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:49:25.084673 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log" Apr 24 22:50:42.407762 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:42.407689 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-k6bhr_ae28b2f3-d733-438a-8a82-1ea82ac5ac63/global-pull-secret-syncer/0.log" Apr 24 22:50:42.507728 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:42.507691 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j5w9d_7502ad9c-0942-4f13-92a5-5c98853da696/konnectivity-agent/0.log" Apr 24 22:50:42.581301 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:42.581270 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-73.ec2.internal_d74e5b7f3ca859862ed6413284694748/haproxy/0.log" Apr 24 22:50:46.275787 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:46.275753 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59n7s_ba1f6487-5042-4d0a-8bb2-4f385d224529/cluster-monitoring-operator/0.log" Apr 24 22:50:46.577734 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:46.577642 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tmp6m_d61e0701-390f-4cc5-a5ac-b8ab5bd08a88/node-exporter/0.log" Apr 24 22:50:46.596066 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:46.596033 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tmp6m_d61e0701-390f-4cc5-a5ac-b8ab5bd08a88/kube-rbac-proxy/0.log" Apr 24 22:50:46.618579 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:46.618554 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-tmp6m_d61e0701-390f-4cc5-a5ac-b8ab5bd08a88/init-textfile/0.log"
Apr 24 22:50:48.427127 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:48.427096 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-9b2dm_9f63adb0-5994-496d-880d-9d660e539622/networking-console-plugin/0.log"
Apr 24 22:50:48.823050 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:48.822949 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/1.log"
Apr 24 22:50:48.826900 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:48.826880 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tbw4p_3517afbe-450e-4a99-a668-a2cc8ca01cbc/console-operator/2.log"
Apr 24 22:50:49.354318 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.354288 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"]
Apr 24 22:50:49.354583 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.354571 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d1f9a0b-38db-4713-969a-f7221408b685" containerName="registry"
Apr 24 22:50:49.354625 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.354585 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1f9a0b-38db-4713-969a-f7221408b685" containerName="registry"
Apr 24 22:50:49.354658 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.354640 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d1f9a0b-38db-4713-969a-f7221408b685" containerName="registry"
Apr 24 22:50:49.357515 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.357498 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.360007 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.359981 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9lwj\"/\"openshift-service-ca.crt\""
Apr 24 22:50:49.361074 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.361055 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9lwj\"/\"kube-root-ca.crt\""
Apr 24 22:50:49.361156 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.361055 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h9lwj\"/\"default-dockercfg-nh675\""
Apr 24 22:50:49.363085 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.363049 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"]
Apr 24 22:50:49.424669 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.424631 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-podres\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.424904 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.424682 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-sys\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.424904 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.424701 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-proc\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.424904 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.424792 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-lib-modules\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.424904 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.424895 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdpr\" (UniqueName: \"kubernetes.io/projected/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-kube-api-access-lkdpr\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526037 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526005 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdpr\" (UniqueName: \"kubernetes.io/projected/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-kube-api-access-lkdpr\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526046 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-podres\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526100 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-sys\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526129 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-proc\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526165 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-lib-modules\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526236 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-sys\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526250 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-proc\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526281 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-podres\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.526453 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.526307 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-lib-modules\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.533596 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.533557 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdpr\" (UniqueName: \"kubernetes.io/projected/b43811ef-ed39-498e-a3b4-0e543dc9fb6e-kube-api-access-lkdpr\") pod \"perf-node-gather-daemonset-wjzv4\" (UID: \"b43811ef-ed39-498e-a3b4-0e543dc9fb6e\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.614587 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.614507 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dnmh6_b95e6492-0f90-4364-8816-060a9df92b34/volume-data-source-validator/0.log"
Apr 24 22:50:49.668086 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.668050 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.789407 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.789279 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"]
Apr 24 22:50:49.792218 ip-10-0-133-73 kubenswrapper[2582]: W0424 22:50:49.792181 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb43811ef_ed39_498e_a3b4_0e543dc9fb6e.slice/crio-e0ba8a193a13619bbd178c193d20fb37a77323f607a7153de7a24ba48fd02e9b WatchSource:0}: Error finding container e0ba8a193a13619bbd178c193d20fb37a77323f607a7153de7a24ba48fd02e9b: Status 404 returned error can't find the container with id e0ba8a193a13619bbd178c193d20fb37a77323f607a7153de7a24ba48fd02e9b
Apr 24 22:50:49.793870 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.793854 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:50:49.952086 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.952051 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4" event={"ID":"b43811ef-ed39-498e-a3b4-0e543dc9fb6e","Type":"ContainerStarted","Data":"876942e5c593d4a3a2715334c5394f121b0f0676355570fa56f1b0069b384295"}
Apr 24 22:50:49.952188 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.952093 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4" event={"ID":"b43811ef-ed39-498e-a3b4-0e543dc9fb6e","Type":"ContainerStarted","Data":"e0ba8a193a13619bbd178c193d20fb37a77323f607a7153de7a24ba48fd02e9b"}
Apr 24 22:50:49.952239 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.952192 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:49.968245 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:49.968207 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4" podStartSLOduration=0.968194894 podStartE2EDuration="968.194894ms" podCreationTimestamp="2026-04-24 22:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:49.966217355 +0000 UTC m=+1285.476744555" watchObservedRunningTime="2026-04-24 22:50:49.968194894 +0000 UTC m=+1285.478722090"
Apr 24 22:50:50.231285 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:50.231207 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6w7zw_e4b92403-6523-4474-832b-2bb3cb7d7b9d/dns/0.log"
Apr 24 22:50:50.251744 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:50.251721 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6w7zw_e4b92403-6523-4474-832b-2bb3cb7d7b9d/kube-rbac-proxy/0.log"
Apr 24 22:50:50.371418 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:50.371390 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fq7vn_e6d48c3e-9be8-4750-ab9c-18ee060b61dd/dns-node-resolver/0.log"
Apr 24 22:50:50.876546 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:50.876513 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sw5hp_fb372715-ce4d-476e-881a-eedf339ac388/node-ca/0.log"
Apr 24 22:50:51.553051 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:51.553010 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-74654b95d8-zp62l_a46814a2-9573-4978-a715-70fdad9204e4/router/0.log"
Apr 24 22:50:51.883767 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:51.883729 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4mm5b_3a180131-c839-45eb-9da2-6f9ffa71d641/serve-healthcheck-canary/0.log"
Apr 24 22:50:52.292302 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:52.292215 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-p6xqh_23d7baa2-f8ae-4472-abcb-860f16acd197/insights-operator/0.log"
Apr 24 22:50:52.292455 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:52.292323 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-p6xqh_23d7baa2-f8ae-4472-abcb-860f16acd197/insights-operator/1.log"
Apr 24 22:50:52.381976 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:52.381945 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mpcwd_fdbacdd2-1641-4b55-904b-22b681419f52/kube-rbac-proxy/0.log"
Apr 24 22:50:52.399612 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:52.399590 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mpcwd_fdbacdd2-1641-4b55-904b-22b681419f52/exporter/0.log"
Apr 24 22:50:52.417990 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:52.417961 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mpcwd_fdbacdd2-1641-4b55-904b-22b681419f52/extractor/0.log"
Apr 24 22:50:55.965395 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:55.965363 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-wjzv4"
Apr 24 22:50:58.801827 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:58.801769 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9c9zw_de154e2a-4cec-4799-b4c9-72ed4e2d85c4/kube-storage-version-migrator-operator/1.log"
Apr 24 22:50:58.802928 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:58.802911 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9c9zw_de154e2a-4cec-4799-b4c9-72ed4e2d85c4/kube-storage-version-migrator-operator/0.log"
Apr 24 22:50:59.838337 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:59.838308 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:50:59.856904 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:59.856873 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/egress-router-binary-copy/0.log"
Apr 24 22:50:59.875371 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:59.875334 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/cni-plugins/0.log"
Apr 24 22:50:59.894961 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:59.894931 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/bond-cni-plugin/0.log"
Apr 24 22:50:59.928773 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:59.928739 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/routeoverride-cni/0.log"
Apr 24 22:50:59.995916 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:50:59.995878 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/whereabouts-cni-bincopy/0.log"
Apr 24 22:51:00.037749 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:00.037718 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n5rkg_acca45df-62e2-4002-8d37-055685b49029/whereabouts-cni/0.log"
Apr 24 22:51:00.244976 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:00.244946 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d8kzk_f7494b64-7d53-401b-8d5f-fbe58b9bf342/kube-multus/0.log"
Apr 24 22:51:00.293484 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:00.293456 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-44r7l_c19cc309-d892-45ed-a3cd-43a98273bafb/network-metrics-daemon/0.log"
Apr 24 22:51:00.316925 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:00.316890 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-44r7l_c19cc309-d892-45ed-a3cd-43a98273bafb/kube-rbac-proxy/0.log"
Apr 24 22:51:01.124489 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.124455 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/ovn-controller/0.log"
Apr 24 22:51:01.147114 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.147081 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/ovn-acl-logging/0.log"
Apr 24 22:51:01.165798 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.165748 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/kube-rbac-proxy-node/0.log"
Apr 24 22:51:01.186067 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.186037 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:51:01.204465 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.204439 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/northd/0.log"
Apr 24 22:51:01.222956 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.222930 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/nbdb/0.log"
Apr 24 22:51:01.245472 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.245446 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/sbdb/0.log"
Apr 24 22:51:01.333682 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:01.333652 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ntzt_59ba8b67-3c2d-436f-beec-62d19349a64d/ovnkube-controller/0.log"
Apr 24 22:51:02.815019 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:02.814992 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-nncrg_9b249279-0358-4efe-bcbd-16ecb96ece58/check-endpoints/0.log"
Apr 24 22:51:02.878252 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:02.878221 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-k24sc_df25b403-ced4-4c31-9691-1da44a52f2a0/network-check-target-container/0.log"
Apr 24 22:51:03.716445 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:03.716415 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-b6zc7_3d1e9be3-c77c-4335-aa84-6e1675c140a1/iptables-alerter/0.log"
Apr 24 22:51:04.416555 ip-10-0-133-73 kubenswrapper[2582]: I0424 22:51:04.416518 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vcbqm_527aa17b-dc79-48d9-ab47-acf333ccde3f/tuned/0.log"