Apr 16 22:13:27.033030 ip-10-0-136-39 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:27.478320 ip-10-0-136-39 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:27.478320 ip-10-0-136-39 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:27.478320 ip-10-0-136-39 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:27.478320 ip-10-0-136-39 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:27.478320 ip-10-0-136-39 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:27.479771 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.479674 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:27.485071 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485057 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485072 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485077 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485081 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485084 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485087 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485090 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485092 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485095 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485102 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485105 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485108 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485111 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485114 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:27.485111 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485117 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485119 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485122 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485125 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485128 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485131 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485133 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485136 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485138 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485140 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485144 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485146 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485149 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485151 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485154 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485156 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485159 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485161 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485164 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485166 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:27.485574 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485169 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485172 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485174 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485177 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485179 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485182 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485184 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485187 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485227 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485278 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485281 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485284 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485287 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485289 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485294 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485297 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485300 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485322 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485325 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485328 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:27.486289 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485331 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485338 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.485340 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486130 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486145 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486152 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486158 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486164 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486169 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486173 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486178 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486183 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486188 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486193 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486197 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486207 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486212 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486216 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486221 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486225 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:27.486817 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486234 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486239 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486244 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486247 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486250 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486253 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486282 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486301 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486410 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486416 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486420 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486423 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486845 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486851 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486855 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486858 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486861 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486864 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486866 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486869 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:27.487293 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486872 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486874 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486877 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486879 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486882 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486884 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486887 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486890 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486892 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486896 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486901 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486905 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486908 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486911 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486914 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486918 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486921 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486924 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486927 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:27.487823 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486929 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486932 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486934 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486937 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486940 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486943 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486946 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486948 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486951 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486953 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486956 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486958 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486961 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486964 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486967 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486969 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486972 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486974 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486977 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486979 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:27.488294 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486981 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486984 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486986 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486989 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486991 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486993 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486996 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.486999 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487002 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487005 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487008 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487010 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487013 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487015 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487017 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487020 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487023 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487025 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487027 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:27.488836 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487030 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487033 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487036 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487039 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487041 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487044 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487046 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487049 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487052 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487054 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487057 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487059 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487061 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487064 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487067 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487069 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487071 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487074 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487076 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487079 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487151 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:27.489297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487158 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487166 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487171 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487176 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487179 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487183 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487188 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487191 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487195 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487198 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487201 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487205 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487209 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487212 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487215 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487218 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487220 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487223 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487228 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487231 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487234 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487236 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487239 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487244 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:27.489816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487247 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487250 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487253 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487256 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487259 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487262 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487265 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487269 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487274 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487277 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487280 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487283 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487286 2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487289 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487294 2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487298 2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487301 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487305 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487308 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487312 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487315 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487319 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487322 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487325 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487328 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:27.490399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487331 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487334 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487337 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487340 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487343 2579 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487347 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487350 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487353 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487356 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487359 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487362 2579 flags.go:64] FLAG: --help="false"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487365 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-136-39.ec2.internal"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487368 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487371 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]:
I0416 22:13:27.487375 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487378 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487382 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487385 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487388 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487391 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487394 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487397 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487400 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487403 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:27.491015 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487405 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487408 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487411 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487414 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 
22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487416 2579 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487420 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487423 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487426 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487431 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487433 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487436 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487439 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487442 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487445 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487448 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487451 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487455 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487458 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487462 2579 flags.go:64] FLAG: 
--max-pods="110" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487465 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487468 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487471 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487474 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487478 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487481 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:27.491585 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487484 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487492 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487495 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487498 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487501 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487504 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487510 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: 
I0416 22:13:27.487513 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487516 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487519 2579 flags.go:64] FLAG: --port="10250" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487522 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487525 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-082cae8cdfe1af5e6" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487528 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487531 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487535 2579 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487540 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487543 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487547 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487550 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487552 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487556 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487559 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487563 2579 flags.go:64] FLAG: 
--root-dir="/var/lib/kubelet" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487565 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487568 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:27.492199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487571 2579 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487574 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487577 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487579 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487582 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487585 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487589 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487592 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487595 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487598 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487600 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487603 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:27.492820 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:13:27.487606 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487609 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487612 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487614 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487619 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487622 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487625 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487629 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487632 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487635 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487639 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487642 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487645 2579 flags.go:64] FLAG: --v="2" Apr 16 22:13:27.492820 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487649 2579 flags.go:64] FLAG: --version="false" Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487653 2579 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:27.493417 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:13:27.487657 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.487661 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487763 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487768 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487771 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487774 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487777 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487780 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487782 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487785 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487788 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487791 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487793 2579 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487796 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487799 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487802 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487804 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487807 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:27.493417 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487809 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487812 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487814 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487817 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487819 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487822 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487826 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487830 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487833 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487840 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487843 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487846 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487859 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487863 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487866 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487869 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487872 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487875 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487877 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:27.493948 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487880 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487882 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487885 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487887 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487890 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487893 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487896 2579 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487898 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487901 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487904 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487906 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487909 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487911 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487914 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487916 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487919 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487921 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487924 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487926 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 
22:13:27.487929 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:27.494422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487931 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487934 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487937 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487941 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487943 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487946 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487949 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487951 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487954 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487956 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487959 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487961 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:27.494928 ip-10-0-136-39 
kubenswrapper[2579]: W0416 22:13:27.487964 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487967 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487970 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487972 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487975 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487977 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487980 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487983 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:27.494928 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487986 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487988 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487991 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487993 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487996 2579 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.487998 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.488001 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.488003 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.488005 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.488008 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.488010 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.488527 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.494609 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.494623 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494669 2579 feature_gate.go:328] unrecognized 
feature gate: MachineAPIMigration Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494673 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:27.495422 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494676 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494679 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494682 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494685 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494688 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494691 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494694 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494697 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494701 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494705 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494708 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494711 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494714 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494716 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494719 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494738 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494744 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494748 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494751 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:27.495837 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494754 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494757 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494759 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494762 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494765 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494767 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494770 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494772 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494775 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494777 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494788 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494791 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494794 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494797 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494800 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494803 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494805 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494808 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494811 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494813 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494816 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:27.496301 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494819 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494821 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494824 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494826 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494829 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494832 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494834 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494837 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494839 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494842 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494844 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494847 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494849 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494852 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494854 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494857 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494859 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494862 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494864 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:27.496892 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494867 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494870 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494873 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494877 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494879 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494882 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494884 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494887 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494889 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494892 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494895 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494897 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494900 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494902 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494905 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494907 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494910 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494912 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494915 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494917 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:27.497356 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494920 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494923 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494925 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494927 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.494930 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.494935 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495034 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495039 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495043 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495045 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495049 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495052 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495055 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495057 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495061 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:27.497888 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495064 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495067 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495070 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495072 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495075 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495078 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495080 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495083 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495085 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495088 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495090 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495092 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495095 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495098 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495100 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495103 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495105 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495108 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495110 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495113 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495115 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:27.498264 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495117 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495120 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495124 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495127 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495130 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495133 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495135 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495139 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495142 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495145 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495149 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495152 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495155 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495159 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495162 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495165 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495168 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495170 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495173 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:27.498804 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495175 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495178 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495181 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495184 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495186 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495189 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495191 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495194 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495196 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495199 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495201 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495203 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495206 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495208 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495211 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495213 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495216 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495218 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495220 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:27.499258 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495223 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495226 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495228 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495230 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495233 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495236 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495239 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495241 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495244 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495247 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495249 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495252 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495255 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495257 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495260 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495262 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495265 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:27.499800 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:27.495267 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:27.500221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.495272 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:27.500221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.495977 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:27.500221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.499233 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:27.500221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.500092 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:27.500221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.500187 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:27.500348 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.500227 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:27.525408 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.525390 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:27.527703 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.527684 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:27.542575 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.542555 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:13:27.548166 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.548144 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 22:13:27.550490 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.550476 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:13:27.553480 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.553462 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8f4742ef-5606-402f-99b7-186ab8dac7e5:/dev/nvme0n1p4 c358785c-1a8f-4fd9-a53e-d94851e9c91c:/dev/nvme0n1p3]
Apr 16 22:13:27.553545 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.553479 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:13:27.554449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.554430 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:27.558702 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.558600 2579 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:27.557042633 +0000 UTC m=+0.402811537 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096479 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29085cb3fac64692c0baa3e661cfed SystemUUID:ec29085c-b3fa-c646-92c0-baa3e661cfed BootID:976440d6-f8eb-4308-b40a-1351aa58dcd1 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8f:4d:bb:09:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8f:4d:bb:09:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:8b:30:ca:b1:17 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:27.558702 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.558694 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:27.558840 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.558813 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 22:13:27.560550 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.560522 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 22:13:27.560680 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.560553 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-39.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:13:27.560776 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.560688 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:13:27.560776 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.560696 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:13:27.560776 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.560709 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:27.561618 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.561607 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:27.562390 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.562380 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:27.562488 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.562479 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:27.564827 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.564817 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:27.564864 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.564836 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:13:27.564864 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.564847 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:13:27.564864 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.564857 2579 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:27.564956 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.564865 2579 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 22:13:27.565871 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.565860 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:27.565920 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.565877 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:27.568574 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.568558 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:27.569800 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.569788 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:27.571066 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571055 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:27.571101 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571072 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:27.571101 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571078 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:27.571101 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571084 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:27.571101 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571090 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:27.571101 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571095 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:27.571243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571108 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
22:13:27.571243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571114 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:27.571243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571121 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:27.571243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571127 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:27.571243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571139 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:27.571243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571148 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:27.571932 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571921 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:27.571932 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.571931 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:27.575221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.575205 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:27.575306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.575242 2579 server.go:1295] "Started kubelet" Apr 16 22:13:27.575370 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.575326 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:27.575423 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.575394 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:27.575468 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.575339 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:27.576089 ip-10-0-136-39 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 22:13:27.576475 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.576459 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:27.577573 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.577557 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:27.578505 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.578473 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:13:27.578590 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.578545 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-39.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:13:27.578673 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.578656 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-39.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:13:27.581414 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.581393 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:27.581795 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.581783 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:27.585301 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.583910 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:27.585301 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:13:27.583901 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:27.585301 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.584008 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:27.585301 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.584232 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found" Apr 16 22:13:27.585301 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.584850 2579 factory.go:55] Registering systemd factory Apr 16 22:13:27.585301 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.584864 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:27.585594 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.585339 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:27.585594 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.585354 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:27.586317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.585973 2579 factory.go:153] Registering CRI-O factory Apr 16 22:13:27.586317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.585991 2579 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:27.586317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.586104 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:27.586317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.586126 2579 factory.go:103] Registering Raw factory Apr 16 22:13:27.586317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.586142 2579 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:27.586581 ip-10-0-136-39 kubenswrapper[2579]: E0416 
22:13:27.586403 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:27.586629 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.586619 2579 manager.go:319] Starting recovery of all containers Apr 16 22:13:27.587695 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.586543 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-39.ec2.internal.18a6f605be7aa3dd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-39.ec2.internal,UID:ip-10-0-136-39.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-39.ec2.internal,},FirstTimestamp:2026-04-16 22:13:27.575217117 +0000 UTC m=+0.420986021,LastTimestamp:2026-04-16 22:13:27.575217117 +0000 UTC m=+0.420986021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-39.ec2.internal,}" Apr 16 22:13:27.590619 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.590589 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-39.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 22:13:27.590843 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.590822 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 22:13:27.597501 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.597483 2579 manager.go:324] Recovery completed Apr 16 22:13:27.602160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.602147 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:27.604621 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.604595 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:27.604682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.604634 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:27.604682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.604644 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:27.605138 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.605126 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:27.605205 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.605140 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:27.605205 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.605156 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:27.606469 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.606408 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-39.ec2.internal.18a6f605c03b50fa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-39.ec2.internal,UID:ip-10-0-136-39.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-39.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-39.ec2.internal,},FirstTimestamp:2026-04-16 22:13:27.604621562 +0000 UTC m=+0.450390465,LastTimestamp:2026-04-16 22:13:27.604621562 +0000 UTC m=+0.450390465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-39.ec2.internal,}" Apr 16 22:13:27.607130 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.607115 2579 policy_none.go:49] "None policy: Start" Apr 16 22:13:27.607130 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.607132 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:27.607232 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.607142 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:27.612244 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.612226 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6t7g7" Apr 16 22:13:27.614063 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.614001 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-39.ec2.internal.18a6f605c03b94a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-39.ec2.internal,UID:ip-10-0-136-39.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-39.ec2.internal status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-39.ec2.internal,},FirstTimestamp:2026-04-16 22:13:27.604638882 +0000 UTC m=+0.450407784,LastTimestamp:2026-04-16 22:13:27.604638882 +0000 UTC m=+0.450407784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-39.ec2.internal,}" Apr 16 22:13:27.617504 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.617485 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6t7g7" Apr 16 22:13:27.646295 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646278 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.646311 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646320 2579 server.go:85] "Starting device plugin registration server" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646501 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646510 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646618 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646684 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.646690 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: E0416 
22:13:27.647077 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 22:13:27.649259 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.647129 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-39.ec2.internal\" not found" Apr 16 22:13:27.670419 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.670397 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 22:13:27.671565 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.671543 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 22:13:27.671633 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.671570 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:13:27.671633 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.671584 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 22:13:27.671633 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.671591 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:13:27.671773 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.671650 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:13:27.673865 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.673845 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:27.747077 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.747022 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:27.748542 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.748527 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:27.748637 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.748555 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:27.748637 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.748565 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:27.748637 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.748586 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.757018 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.757003 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.757081 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.757023 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-39.ec2.internal\": node \"ip-10-0-136-39.ec2.internal\" not found" Apr 16 22:13:27.771294 
ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.771278 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found" Apr 16 22:13:27.772309 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.772295 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal"] Apr 16 22:13:27.772355 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.772351 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:27.773509 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.773497 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:27.773562 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.773521 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:27.773562 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.773531 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:27.774622 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.774609 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:27.774747 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.774713 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.774789 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.774749 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:27.775327 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.775311 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:27.775327 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.775319 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:27.775449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.775338 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:27.775449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.775340 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:27.775449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.775372 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:27.775449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.775352 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:27.776466 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.776453 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.776519 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.776476 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:27.777193 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.777175 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:27.777269 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.777203 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:27.777269 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.777215 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:27.786075 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.786060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70d54601393f1a639ae09597c45fde47-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal\" (UID: \"70d54601393f1a639ae09597c45fde47\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.786139 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.786083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d26c4d7ef73b8123da4c66a608d9e63b-config\") pod \"kube-apiserver-proxy-ip-10-0-136-39.ec2.internal\" (UID: \"d26c4d7ef73b8123da4c66a608d9e63b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.786139 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.786100 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/70d54601393f1a639ae09597c45fde47-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal\" (UID: \"70d54601393f1a639ae09597c45fde47\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.795673 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.795645 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-39.ec2.internal\" not found" node="ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.800226 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.800207 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-39.ec2.internal\" not found" node="ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.872199 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.872181 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found" Apr 16 22:13:27.886476 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.886456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/70d54601393f1a639ae09597c45fde47-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal\" (UID: \"70d54601393f1a639ae09597c45fde47\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" Apr 16 22:13:27.886535 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.886488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70d54601393f1a639ae09597c45fde47-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal\" (UID: \"70d54601393f1a639ae09597c45fde47\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:27.886535 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.886505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d26c4d7ef73b8123da4c66a608d9e63b-config\") pod \"kube-apiserver-proxy-ip-10-0-136-39.ec2.internal\" (UID: \"d26c4d7ef73b8123da4c66a608d9e63b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:27.886611 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.886544 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70d54601393f1a639ae09597c45fde47-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal\" (UID: \"70d54601393f1a639ae09597c45fde47\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:27.886611 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.886550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/70d54601393f1a639ae09597c45fde47-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal\" (UID: \"70d54601393f1a639ae09597c45fde47\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:27.886611 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:27.886604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d26c4d7ef73b8123da4c66a608d9e63b-config\") pod \"kube-apiserver-proxy-ip-10-0-136-39.ec2.internal\" (UID: \"d26c4d7ef73b8123da4c66a608d9e63b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:27.972605 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:27.972577 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.073442 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.073386 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.098904 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.098884 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:28.102356 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.102342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:28.173885 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.173851 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.274400 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.274385 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.375027 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.374969 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.475595 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.475568 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.500150 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.500124 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:28.500695 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.500264 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:28.575807 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.575628 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.582016 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.581998 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:28.599460 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.599443 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:28.609830 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:28.609799 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26c4d7ef73b8123da4c66a608d9e63b.slice/crio-31f9a0f88fe87afd2d9045a41d6b487d9dcb6feb2c19cda6ff8f519308a3c017 WatchSource:0}: Error finding container 31f9a0f88fe87afd2d9045a41d6b487d9dcb6feb2c19cda6ff8f519308a3c017: Status 404 returned error can't find the container with id 31f9a0f88fe87afd2d9045a41d6b487d9dcb6feb2c19cda6ff8f519308a3c017
Apr 16 22:13:28.610284 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:28.610262 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d54601393f1a639ae09597c45fde47.slice/crio-ca76fa12b46209e79d872f2861709135893e8752797d9e367e4930f808131314 WatchSource:0}: Error finding container ca76fa12b46209e79d872f2861709135893e8752797d9e367e4930f808131314: Status 404 returned error can't find the container with id ca76fa12b46209e79d872f2861709135893e8752797d9e367e4930f808131314
Apr 16 22:13:28.614664 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.614638 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:28.619297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.619274 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:27 +0000 UTC" deadline="2028-01-24 01:15:05.551399948 +0000 UTC"
Apr 16 22:13:28.619297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.619297 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15531h1m36.932105435s"
Apr 16 22:13:28.628092 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.628032 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8rggn"
Apr 16 22:13:28.637009 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.636990 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8rggn"
Apr 16 22:13:28.674146 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.674110 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" event={"ID":"70d54601393f1a639ae09597c45fde47","Type":"ContainerStarted","Data":"ca76fa12b46209e79d872f2861709135893e8752797d9e367e4930f808131314"}
Apr 16 22:13:28.674947 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.674929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal" event={"ID":"d26c4d7ef73b8123da4c66a608d9e63b","Type":"ContainerStarted","Data":"31f9a0f88fe87afd2d9045a41d6b487d9dcb6feb2c19cda6ff8f519308a3c017"}
Apr 16 22:13:28.676002 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.675987 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.776617 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.776588 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.808713 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:28.808694 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:28.877670 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.877642 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:28.978299 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:28.978249 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-39.ec2.internal\" not found"
Apr 16 22:13:29.042772 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.042752 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:29.076261 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.076236 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:29.082294 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.082276 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:29.095805 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.095783 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:29.097143 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.097123 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal"
Apr 16 22:13:29.105529 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.105508 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:29.419776 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.419695 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:29.566092 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.566054 2579 apiserver.go:52] "Watching apiserver"
Apr 16 22:13:29.576451 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.576429 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:13:29.578268 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.578243 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8","openshift-cluster-node-tuning-operator/tuned-8x5qm","openshift-image-registry/node-ca-flbzq","openshift-multus/multus-5c5ns","openshift-multus/multus-additional-cni-plugins-knsc9","openshift-network-operator/iptables-alerter-8r8rq","kube-system/konnectivity-agent-fjlpn","kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal","openshift-multus/network-metrics-daemon-qgfjd","openshift-network-diagnostics/network-check-target-6978g","openshift-ovn-kubernetes/ovnkube-node-z4npz"]
Apr 16 22:13:29.582375 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.582354 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.582679 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.582583 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flbzq"
Apr 16 22:13:29.584620 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.584563 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.585034 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.585009 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:13:29.585577 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.585555 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:13:29.585677 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.585624 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5bh96\""
Apr 16 22:13:29.586306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.585838 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:13:29.586306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.585197 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:13:29.586306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.586106 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:13:29.586306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.586148 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:13:29.586306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.586181 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jgllw\""
Apr 16 22:13:29.588324 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.586933 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lvjq\""
Apr 16 22:13:29.588324 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.587158 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:13:29.588324 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.587188 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:13:29.588324 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.587424 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:13:29.588324 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.587880 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:13:29.590050 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.590023 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-knsc9"
Apr 16 22:13:29.590145 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.590111 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8r8rq"
Apr 16 22:13:29.592299 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592282 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vpbk8\""
Apr 16 22:13:29.592397 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fjlpn"
Apr 16 22:13:29.592502 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592484 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jhztv\""
Apr 16 22:13:29.592567 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592489 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:13:29.592567 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:29.592829 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592745 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 22:13:29.592924 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592895 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:29.592996 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.592974 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:13:29.595089 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.594966 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:13:29.595089 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.594975 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:13:29.595089 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.595001 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-82tk7\""
Apr 16 22:13:29.596630 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596606 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-cni-multus\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.596702 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-multus-certs\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.596702 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596672 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-sys-fs\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.596815 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/598903e0-6d7b-4392-b685-da66c0408923-serviceca\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq"
Apr 16 22:13:29.596815 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-cni-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.596815 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-socket-dir-parent\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.596815 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-netns\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.596815 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596812 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596830 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8r6b\" (UniqueName: \"kubernetes.io/projected/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-kube-api-access-c8r6b\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-os-release\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-daemon-config\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.596893 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/598903e0-6d7b-4392-b685-da66c0408923-host\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwld\" (UniqueName: \"kubernetes.io/projected/598903e0-6d7b-4392-b685-da66c0408923-kube-api-access-tqwld\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.596991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-cni-bin\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597045 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-kubelet\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-conf-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597136 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-device-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-k8s-cni-cncf-io\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-etc-kubernetes\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-etc-selinux\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-system-cni-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-registration-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597275 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-cnibin\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597297 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-cni-binary-copy\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xjzv\" (UniqueName: \"kubernetes.io/projected/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-kube-api-access-2xjzv\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-socket-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.597404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.597387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-hostroot\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.599068 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.599048 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:29.599480 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.599268 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8x5qm"
Apr 16 22:13:29.601177 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.601149 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:29.601263 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.601211 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:29.601852 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.601837 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:13:29.601922 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.601876 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:13:29.601978 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.601950 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:29.602070 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.602048 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:13:29.602704 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.602685 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:13:29.602798 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.602739 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:13:29.603093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.602972 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:29.603093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.602991 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:13:29.603093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.602999 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6j4vn\""
Apr 16 22:13:29.603093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.603012 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dkljr\""
Apr 16 22:13:29.637764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.637736 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:28 +0000 UTC" deadline="2027-11-29 11:49:54.964588917 +0000 UTC"
Apr 16 22:13:29.637764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.637765 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14197h36m25.326828844s"
Apr 16 22:13:29.684834 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.684779 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:29.698137 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-hostroot\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.698224 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-cnibin\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9"
Apr 16 22:13:29.698224 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-hostroot\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.698224 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698188 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65407a6-d8b9-47f5-ac3f-231ddd09de73-iptables-alerter-script\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq"
Apr 16 22:13:29.698358 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698245 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-slash\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:29.698358 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-os-release\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9"
Apr 16 22:13:29.698358 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698295 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhs2b\" (UniqueName: \"kubernetes.io/projected/c65407a6-d8b9-47f5-ac3f-231ddd09de73-kube-api-access-zhs2b\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq"
Apr 16 22:13:29.698358 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-cni-multus\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns"
Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-sys-fs\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8"
Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/598903e0-6d7b-4392-b685-da66c0408923-serviceca\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq"
Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-cni-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " 
pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-socket-dir-parent\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-sys-fs\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8r6b\" (UniqueName: \"kubernetes.io/projected/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-kube-api-access-c8r6b\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-os-release\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698423 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-cni-multus\") pod \"multus-5c5ns\" (UID: 
\"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-cni-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698517 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxw2l\" (UniqueName: \"kubernetes.io/projected/a2d7d39e-d19f-4a6e-8107-593903f29181-kube-api-access-mxw2l\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-socket-dir-parent\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-os-release\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-netns\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysconfig\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698599 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-netns\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-kubernetes\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysctl-conf\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwld\" (UniqueName: \"kubernetes.io/projected/598903e0-6d7b-4392-b685-da66c0408923-kube-api-access-tqwld\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698685 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-cni-bin\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-cni-netd\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-var-lib-kubelet\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-cni-bin\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/57d6ffd8-4f1e-4169-adf4-276bac26da26-agent-certs\") pod \"konnectivity-agent-fjlpn\" (UID: \"57d6ffd8-4f1e-4169-adf4-276bac26da26\") " pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-kubelet\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-conf-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.698993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/598903e0-6d7b-4392-b685-da66c0408923-serviceca\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mhmf\" (UniqueName: \"kubernetes.io/projected/3894fa51-bc91-4390-ab13-ef051552e33a-kube-api-access-7mhmf\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-conf-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-var-lib-kubelet\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698934 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-run\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.698971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-host\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-k8s-cni-cncf-io\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-kubelet\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-var-lib-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-k8s-cni-cncf-io\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699139 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-run-ovn-kubernetes\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-ovnkube-config\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-ovnkube-script-lib\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-sys\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-system-cni-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.699625 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:13:29.699295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-system-cni-dir\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.699625 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-env-overrides\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699355 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-systemd\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-lib-modules\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-registration-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-cnibin\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-cni-binary-copy\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xjzv\" (UniqueName: \"kubernetes.io/projected/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-kube-api-access-2xjzv\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699479 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-systemd-units\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-registration-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-ovn\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-cnibin\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-socket-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699714 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699761 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-socket-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/57d6ffd8-4f1e-4169-adf4-276bac26da26-konnectivity-ca\") pod \"konnectivity-agent-fjlpn\" (UID: \"57d6ffd8-4f1e-4169-adf4-276bac26da26\") " pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.700292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699918 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-log-socket\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700998 
ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-multus-certs\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.699999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-host-run-multus-certs\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700002 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-cni-bin\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3894fa51-bc91-4390-ab13-ef051552e33a-ovn-node-metrics-cert\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-daemon-config\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " 
pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-run-netns\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-systemd\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700149 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-etc-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-modprobe-d\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/598903e0-6d7b-4392-b685-da66c0408923-host\") pod \"node-ca-flbzq\" (UID: 
\"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700233 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/598903e0-6d7b-4392-b685-da66c0408923-host\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700264 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-cni-binary-copy\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysctl-d\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-tuned\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcd5r\" (UniqueName: 
\"kubernetes.io/projected/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-kube-api-access-rcd5r\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.700998 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700418 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qtc\" (UniqueName: \"kubernetes.io/projected/bcee1191-76cc-4c41-ad57-b41d75589f20-kube-api-access-q6qtc\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700443 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-node-log\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-device-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 
22:13:29.700491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-cni-binary-copy\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-etc-kubernetes\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-etc-kubernetes\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-multus-daemon-config\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700541 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700565 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-device-dir\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-etc-selinux\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-system-cni-dir\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65407a6-d8b9-47f5-ac3f-231ddd09de73-host-slash\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " 
pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700661 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700622 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-etc-selinux\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.701493 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.700683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-tmp\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.709319 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.709280 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:29.712918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.712896 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwld\" (UniqueName: \"kubernetes.io/projected/598903e0-6d7b-4392-b685-da66c0408923-kube-api-access-tqwld\") pod \"node-ca-flbzq\" (UID: \"598903e0-6d7b-4392-b685-da66c0408923\") " pod="openshift-image-registry/node-ca-flbzq" Apr 16 22:13:29.712918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.712908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8r6b\" (UniqueName: \"kubernetes.io/projected/74b3275a-d05c-43f4-a4f9-59f1e5fffbed-kube-api-access-c8r6b\") pod \"aws-ebs-csi-driver-node-smnf8\" (UID: \"74b3275a-d05c-43f4-a4f9-59f1e5fffbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.713082 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.712898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xjzv\" (UniqueName: \"kubernetes.io/projected/e3a1ce43-bbf9-45df-abbd-7ec6821f991b-kube-api-access-2xjzv\") pod \"multus-5c5ns\" (UID: \"e3a1ce43-bbf9-45df-abbd-7ec6821f991b\") " pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.801077 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-cnibin\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.801077 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/c65407a6-d8b9-47f5-ac3f-231ddd09de73-iptables-alerter-script\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-slash\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-os-release\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhs2b\" (UniqueName: \"kubernetes.io/projected/c65407a6-d8b9-47f5-ac3f-231ddd09de73-kube-api-access-zhs2b\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2l\" (UniqueName: \"kubernetes.io/projected/a2d7d39e-d19f-4a6e-8107-593903f29181-kube-api-access-mxw2l\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801171 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-cnibin\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-slash\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.801282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysconfig\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-os-release\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.801582 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:13:29.801303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysconfig\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801376 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-kubernetes\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysctl-conf\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-cni-netd\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 
22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-var-lib-kubelet\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-kubernetes\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/57d6ffd8-4f1e-4169-adf4-276bac26da26-agent-certs\") pod \"konnectivity-agent-fjlpn\" (UID: \"57d6ffd8-4f1e-4169-adf4-276bac26da26\") " pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801564 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mhmf\" (UniqueName: \"kubernetes.io/projected/3894fa51-bc91-4390-ab13-ef051552e33a-kube-api-access-7mhmf\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.801582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-run\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-host\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-kubelet\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-var-lib-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65407a6-d8b9-47f5-ac3f-231ddd09de73-iptables-alerter-script\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " 
pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-run-ovn-kubernetes\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysctl-conf\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801712 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-var-lib-kubelet\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-ovnkube-config\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-cni-netd\") pod \"ovnkube-node-z4npz\" (UID: 
\"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-run-ovn-kubernetes\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-host\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.801961 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:29.802010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-kubelet\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.802035 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:30.30200334 +0000 UTC m=+3.147772230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-run\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802052 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-ovnkube-script-lib\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-sys\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-env-overrides\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802135 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-systemd\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-lib-modules\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-systemd-units\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.801959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-var-lib-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-ovn\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-ovn\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802265 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-ovnkube-config\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-sys\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.802357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802323 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-systemd\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802334 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/57d6ffd8-4f1e-4169-adf4-276bac26da26-konnectivity-ca\") pod \"konnectivity-agent-fjlpn\" (UID: \"57d6ffd8-4f1e-4169-adf4-276bac26da26\") " pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-lib-modules\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-log-socket\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: 
I0416 22:13:29.802406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-ovnkube-script-lib\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-cni-bin\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-systemd-units\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3894fa51-bc91-4390-ab13-ef051552e33a-ovn-node-metrics-cert\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-cni-bin\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: 
I0416 22:13:29.802461 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-run-netns\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-run-netns\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-systemd\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-etc-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-modprobe-d\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: 
I0416 22:13:29.802579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-cni-binary-copy\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802622 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-systemd\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.802999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3894fa51-bc91-4390-ab13-ef051552e33a-env-overrides\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-etc-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802711 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysctl-d\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.803783 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:13:29.802835 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-modprobe-d\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-log-socket\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-tuned\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcd5r\" (UniqueName: \"kubernetes.io/projected/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-kube-api-access-rcd5r\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qtc\" (UniqueName: \"kubernetes.io/projected/bcee1191-76cc-4c41-ad57-b41d75589f20-kube-api-access-q6qtc\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 
22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.802978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-node-log\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/57d6ffd8-4f1e-4169-adf4-276bac26da26-konnectivity-ca\") pod \"konnectivity-agent-fjlpn\" (UID: \"57d6ffd8-4f1e-4169-adf4-276bac26da26\") " pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-knsc9\" 
(UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803235 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-node-log\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803258 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-sysctl-d\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-system-cni-dir\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.803783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/c65407a6-d8b9-47f5-ac3f-231ddd09de73-host-slash\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65407a6-d8b9-47f5-ac3f-231ddd09de73-host-slash\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-tmp\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803436 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3894fa51-bc91-4390-ab13-ef051552e33a-run-openvswitch\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/bcee1191-76cc-4c41-ad57-b41d75589f20-cni-binary-copy\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.803578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcee1191-76cc-4c41-ad57-b41d75589f20-system-cni-dir\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.804558 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.804049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/57d6ffd8-4f1e-4169-adf4-276bac26da26-agent-certs\") pod \"konnectivity-agent-fjlpn\" (UID: \"57d6ffd8-4f1e-4169-adf4-276bac26da26\") " pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.805203 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.805179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3894fa51-bc91-4390-ab13-ef051552e33a-ovn-node-metrics-cert\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.805661 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.805641 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-etc-tuned\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.805760 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.805683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-tmp\") pod \"tuned-8x5qm\" (UID: \"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.819900 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.819881 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:29.820032 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.819902 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:29.820032 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.819915 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:29.820223 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:29.820208 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. No retries permitted until 2026-04-16 22:13:30.32018841 +0000 UTC m=+3.165957310 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:29.823915 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.823833 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxw2l\" (UniqueName: \"kubernetes.io/projected/a2d7d39e-d19f-4a6e-8107-593903f29181-kube-api-access-mxw2l\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:29.824535 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.824384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qtc\" (UniqueName: \"kubernetes.io/projected/bcee1191-76cc-4c41-ad57-b41d75589f20-kube-api-access-q6qtc\") pod \"multus-additional-cni-plugins-knsc9\" (UID: \"bcee1191-76cc-4c41-ad57-b41d75589f20\") " pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.825258 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.825233 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhs2b\" (UniqueName: \"kubernetes.io/projected/c65407a6-d8b9-47f5-ac3f-231ddd09de73-kube-api-access-zhs2b\") pod \"iptables-alerter-8r8rq\" (UID: \"c65407a6-d8b9-47f5-ac3f-231ddd09de73\") " pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.825429 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.825408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcd5r\" (UniqueName: \"kubernetes.io/projected/dae5bcc7-87a6-403b-ab23-dc1fc36c0615-kube-api-access-rcd5r\") pod \"tuned-8x5qm\" (UID: 
\"dae5bcc7-87a6-403b-ab23-dc1fc36c0615\") " pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:29.826591 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.826571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mhmf\" (UniqueName: \"kubernetes.io/projected/3894fa51-bc91-4390-ab13-ef051552e33a-kube-api-access-7mhmf\") pod \"ovnkube-node-z4npz\" (UID: \"3894fa51-bc91-4390-ab13-ef051552e33a\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.897806 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.897779 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" Apr 16 22:13:29.903463 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.903433 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flbzq" Apr 16 22:13:29.911137 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.911119 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5c5ns" Apr 16 22:13:29.916672 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.916657 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-knsc9" Apr 16 22:13:29.926201 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.926185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8r8rq" Apr 16 22:13:29.932718 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.932699 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fjlpn" Apr 16 22:13:29.938342 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.938285 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:13:29.943830 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:29.943814 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" Apr 16 22:13:30.294214 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.294083 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc65407a6_d8b9_47f5_ac3f_231ddd09de73.slice/crio-41213f33333770cccb72a3322f891283341693baecb0ee32c503ed0a77281e56 WatchSource:0}: Error finding container 41213f33333770cccb72a3322f891283341693baecb0ee32c503ed0a77281e56: Status 404 returned error can't find the container with id 41213f33333770cccb72a3322f891283341693baecb0ee32c503ed0a77281e56 Apr 16 22:13:30.297298 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.297275 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a1ce43_bbf9_45df_abbd_7ec6821f991b.slice/crio-4261b016f03a2925fa8d8574929891e74ebcb8e59eef75a233f1348db8477ce1 WatchSource:0}: Error finding container 4261b016f03a2925fa8d8574929891e74ebcb8e59eef75a233f1348db8477ce1: Status 404 returned error can't find the container with id 4261b016f03a2925fa8d8574929891e74ebcb8e59eef75a233f1348db8477ce1 Apr 16 22:13:30.298557 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.298522 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddae5bcc7_87a6_403b_ab23_dc1fc36c0615.slice/crio-3a720b04d45363f694fa1b619d383ebae340d1d69029c8d8f42993c132bfe16f WatchSource:0}: Error finding container 3a720b04d45363f694fa1b619d383ebae340d1d69029c8d8f42993c132bfe16f: Status 404 returned error can't find the container with id 3a720b04d45363f694fa1b619d383ebae340d1d69029c8d8f42993c132bfe16f Apr 16 22:13:30.300006 ip-10-0-136-39 
kubenswrapper[2579]: W0416 22:13:30.299906 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d6ffd8_4f1e_4169_adf4_276bac26da26.slice/crio-d0de77eec3ef4c3d40bc9e4b1c8c0d6efebeae22a95d1165ea2cf3891e9e3323 WatchSource:0}: Error finding container d0de77eec3ef4c3d40bc9e4b1c8c0d6efebeae22a95d1165ea2cf3891e9e3323: Status 404 returned error can't find the container with id d0de77eec3ef4c3d40bc9e4b1c8c0d6efebeae22a95d1165ea2cf3891e9e3323 Apr 16 22:13:30.301032 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.300918 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcee1191_76cc_4c41_ad57_b41d75589f20.slice/crio-286d79ff20d31ba2c6bc1d0c92ba6d735a2fd7a7999d513a2530dfe23acf3cca WatchSource:0}: Error finding container 286d79ff20d31ba2c6bc1d0c92ba6d735a2fd7a7999d513a2530dfe23acf3cca: Status 404 returned error can't find the container with id 286d79ff20d31ba2c6bc1d0c92ba6d735a2fd7a7999d513a2530dfe23acf3cca Apr 16 22:13:30.301773 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.301496 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598903e0_6d7b_4392_b685_da66c0408923.slice/crio-ce54fccb6e380991fe5d4f3ecf2aa0717d8dacb07d6e498512b4dceee41e778a WatchSource:0}: Error finding container ce54fccb6e380991fe5d4f3ecf2aa0717d8dacb07d6e498512b4dceee41e778a: Status 404 returned error can't find the container with id ce54fccb6e380991fe5d4f3ecf2aa0717d8dacb07d6e498512b4dceee41e778a Apr 16 22:13:30.303095 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.303047 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b3275a_d05c_43f4_a4f9_59f1e5fffbed.slice/crio-19378e4043e7e054cff4a0e6e7af128e5cadd983e8d8fde7f4d730ec379add5a WatchSource:0}: Error finding container 
19378e4043e7e054cff4a0e6e7af128e5cadd983e8d8fde7f4d730ec379add5a: Status 404 returned error can't find the container with id 19378e4043e7e054cff4a0e6e7af128e5cadd983e8d8fde7f4d730ec379add5a Apr 16 22:13:30.303795 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:30.303777 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3894fa51_bc91_4390_ab13_ef051552e33a.slice/crio-b98a9b8bdbed7e9fb2d9675b22a2285be5caeb2f7f506af4e3fbf5f2ad4afb9b WatchSource:0}: Error finding container b98a9b8bdbed7e9fb2d9675b22a2285be5caeb2f7f506af4e3fbf5f2ad4afb9b: Status 404 returned error can't find the container with id b98a9b8bdbed7e9fb2d9675b22a2285be5caeb2f7f506af4e3fbf5f2ad4afb9b Apr 16 22:13:30.306210 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.306193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:30.306291 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:30.306281 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:30.306331 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:30.306323 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:31.306310209 +0000 UTC m=+4.152079113 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:30.406740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.406696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:30.406881 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:30.406864 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:30.406918 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:30.406885 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:30.406918 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:30.406894 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:30.406994 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:30.406935 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:31.406922086 +0000 UTC m=+4.252690993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:30.638514 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.638431 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:28 +0000 UTC" deadline="2027-09-26 10:30:51.84746952 +0000 UTC" Apr 16 22:13:30.638514 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.638464 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12660h17m21.209009706s" Apr 16 22:13:30.678686 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.678650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" event={"ID":"74b3275a-d05c-43f4-a4f9-59f1e5fffbed","Type":"ContainerStarted","Data":"19378e4043e7e054cff4a0e6e7af128e5cadd983e8d8fde7f4d730ec379add5a"} Apr 16 22:13:30.680138 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.680104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" event={"ID":"dae5bcc7-87a6-403b-ab23-dc1fc36c0615","Type":"ContainerStarted","Data":"3a720b04d45363f694fa1b619d383ebae340d1d69029c8d8f42993c132bfe16f"} Apr 16 22:13:30.681952 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.681924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8r8rq" 
event={"ID":"c65407a6-d8b9-47f5-ac3f-231ddd09de73","Type":"ContainerStarted","Data":"41213f33333770cccb72a3322f891283341693baecb0ee32c503ed0a77281e56"}
Apr 16 22:13:30.683633 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.683594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"b98a9b8bdbed7e9fb2d9675b22a2285be5caeb2f7f506af4e3fbf5f2ad4afb9b"}
Apr 16 22:13:30.685398 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.685373 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flbzq" event={"ID":"598903e0-6d7b-4392-b685-da66c0408923","Type":"ContainerStarted","Data":"ce54fccb6e380991fe5d4f3ecf2aa0717d8dacb07d6e498512b4dceee41e778a"}
Apr 16 22:13:30.686832 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.686776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerStarted","Data":"286d79ff20d31ba2c6bc1d0c92ba6d735a2fd7a7999d513a2530dfe23acf3cca"}
Apr 16 22:13:30.688899 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.688877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fjlpn" event={"ID":"57d6ffd8-4f1e-4169-adf4-276bac26da26","Type":"ContainerStarted","Data":"d0de77eec3ef4c3d40bc9e4b1c8c0d6efebeae22a95d1165ea2cf3891e9e3323"}
Apr 16 22:13:30.693937 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.692091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5c5ns" event={"ID":"e3a1ce43-bbf9-45df-abbd-7ec6821f991b","Type":"ContainerStarted","Data":"4261b016f03a2925fa8d8574929891e74ebcb8e59eef75a233f1348db8477ce1"}
Apr 16 22:13:30.694938 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.694914 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal" event={"ID":"d26c4d7ef73b8123da4c66a608d9e63b","Type":"ContainerStarted","Data":"24bf5bde84f95fe20e4105c454f1c55199f114cb9020b272c99b964d2be131f1"}
Apr 16 22:13:30.710662 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:30.710609 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-39.ec2.internal" podStartSLOduration=1.710594486 podStartE2EDuration="1.710594486s" podCreationTimestamp="2026-04-16 22:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:30.710030732 +0000 UTC m=+3.555799645" watchObservedRunningTime="2026-04-16 22:13:30.710594486 +0000 UTC m=+3.556363377"
Apr 16 22:13:31.314135 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:31.314072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:31.314279 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.314219 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:31.314336 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.314284 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:33.314266273 +0000 UTC m=+6.160035164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:31.414738 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:31.414642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:31.414889 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.414814 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:31.414889 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.414836 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:31.414889 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.414849 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:31.415038 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.414906 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. No retries permitted until 2026-04-16 22:13:33.414886963 +0000 UTC m=+6.260655874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:31.672703 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:31.672005 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:31.672703 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.672144 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:31.672703 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:31.672537 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:31.672703 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:31.672628 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:31.720598 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:31.720564 2579 generic.go:358] "Generic (PLEG): container finished" podID="70d54601393f1a639ae09597c45fde47" containerID="56895f0eee1c897ffd87c2c90235508823a3d4a5d7870cd8deea373aa5b15c47" exitCode=0
Apr 16 22:13:31.721437 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:31.721409 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" event={"ID":"70d54601393f1a639ae09597c45fde47","Type":"ContainerDied","Data":"56895f0eee1c897ffd87c2c90235508823a3d4a5d7870cd8deea373aa5b15c47"}
Apr 16 22:13:32.728396 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:32.728341 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" event={"ID":"70d54601393f1a639ae09597c45fde47","Type":"ContainerStarted","Data":"f29d2460a1fbe144e9df540598217cc314376fd6e4e9ba2e083b0bd532ded82b"}
Apr 16 22:13:33.331485 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:33.331453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:33.331667 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.331608 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:33.331667 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.331664 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:37.331646905 +0000 UTC m=+10.177415809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:33.432648 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:33.432615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:33.432837 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.432782 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:33.432837 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.432804 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:33.432837 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.432818 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:33.432996 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.432880 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. No retries permitted until 2026-04-16 22:13:37.432861339 +0000 UTC m=+10.278630229 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:33.673001 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:33.672479 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:33.673001 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:33.672501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:33.673001 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.672606 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:33.673001 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:33.672842 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:35.672574 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:35.671898 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:35.672574 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:35.672034 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:35.672574 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:35.672389 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:35.672574 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:35.672531 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:36.346757 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.344680 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-39.ec2.internal" podStartSLOduration=7.344660262 podStartE2EDuration="7.344660262s" podCreationTimestamp="2026-04-16 22:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:32.743260283 +0000 UTC m=+5.589029220" watchObservedRunningTime="2026-04-16 22:13:36.344660262 +0000 UTC m=+9.190429176"
Apr 16 22:13:36.347964 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.347570 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-msw8q"]
Apr 16 22:13:36.351125 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.351102 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.354603 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.354404 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k2vjm\""
Apr 16 22:13:36.354702 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.354644 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:13:36.355614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.355432 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:13:36.458894 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.458851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk72f\" (UniqueName: \"kubernetes.io/projected/e7417e8e-90d4-47a8-926c-b10f15f3a850-kube-api-access-rk72f\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.459056 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.458922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7417e8e-90d4-47a8-926c-b10f15f3a850-tmp-dir\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.459056 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.458979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7417e8e-90d4-47a8-926c-b10f15f3a850-hosts-file\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.559841 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.559792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7417e8e-90d4-47a8-926c-b10f15f3a850-hosts-file\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.560036 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.559878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk72f\" (UniqueName: \"kubernetes.io/projected/e7417e8e-90d4-47a8-926c-b10f15f3a850-kube-api-access-rk72f\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.560036 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.559925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7417e8e-90d4-47a8-926c-b10f15f3a850-tmp-dir\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.560194 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.560159 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7417e8e-90d4-47a8-926c-b10f15f3a850-hosts-file\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.560612 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.560583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7417e8e-90d4-47a8-926c-b10f15f3a850-tmp-dir\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.570948 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.570655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk72f\" (UniqueName: \"kubernetes.io/projected/e7417e8e-90d4-47a8-926c-b10f15f3a850-kube-api-access-rk72f\") pod \"node-resolver-msw8q\" (UID: \"e7417e8e-90d4-47a8-926c-b10f15f3a850\") " pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:36.662028 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:36.661955 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-msw8q"
Apr 16 22:13:37.366996 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:37.366920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:37.367427 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.367084 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:37.367427 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.367160 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.367136211 +0000 UTC m=+18.212905118 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:37.467640 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:37.467600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:37.467813 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.467768 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:37.467813 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.467792 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:37.467813 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.467805 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:37.467991 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.467877 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.467857207 +0000 UTC m=+18.313626114 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:37.673158 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:37.673078 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:37.673320 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.673188 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:37.673399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:37.673382 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:37.673498 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:37.673481 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:39.672621 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:39.672588 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:39.673115 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:39.672738 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:39.673115 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:39.672777 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:39.673115 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:39.672859 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:41.655084 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.655051 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wvd8j"]
Apr 16 22:13:41.657989 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.657966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.658095 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:41.658057 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:41.671969 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.671933 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:41.671969 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.671965 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:41.672146 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:41.672064 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:41.672208 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:41.672151 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:41.797025 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.796988 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.797227 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.797065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3e33b82-1d36-4e38-ae60-42189b25da6e-kubelet-config\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.797227 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.797084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3e33b82-1d36-4e38-ae60-42189b25da6e-dbus\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.898264 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.898234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.898426 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.898291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3e33b82-1d36-4e38-ae60-42189b25da6e-kubelet-config\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.898426 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.898319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3e33b82-1d36-4e38-ae60-42189b25da6e-dbus\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.898426 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:41.898390 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:41.898582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.898446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3e33b82-1d36-4e38-ae60-42189b25da6e-kubelet-config\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:41.898582 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:41.898463 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret podName:c3e33b82-1d36-4e38-ae60-42189b25da6e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:42.398442262 +0000 UTC m=+15.244211165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret") pod "global-pull-secret-syncer-wvd8j" (UID: "c3e33b82-1d36-4e38-ae60-42189b25da6e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:41.898582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:41.898514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3e33b82-1d36-4e38-ae60-42189b25da6e-dbus\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:42.403298 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:42.403268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:42.403455 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:42.403388 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:42.403455 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:42.403440 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret podName:c3e33b82-1d36-4e38-ae60-42189b25da6e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:43.403426855 +0000 UTC m=+16.249195750 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret") pod "global-pull-secret-syncer-wvd8j" (UID: "c3e33b82-1d36-4e38-ae60-42189b25da6e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:43.412234 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:43.412203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:43.412663 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:43.412359 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:43.412663 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:43.412424 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret podName:c3e33b82-1d36-4e38-ae60-42189b25da6e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.412410057 +0000 UTC m=+18.258178946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret") pod "global-pull-secret-syncer-wvd8j" (UID: "c3e33b82-1d36-4e38-ae60-42189b25da6e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:43.672117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:43.672026 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:43.672117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:43.672062 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:43.672333 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:43.672170 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:43.672333 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:43.672271 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:43.672432 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:43.672335 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:43.672482 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:43.672433 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:45.428719 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:45.428678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:45.428719 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:45.428721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:45.429343 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.428839 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:45.429343 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.428914 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.428895542 +0000 UTC m=+34.274664449 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:45.429343 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.428839 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:45.429343 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.428977 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret podName:c3e33b82-1d36-4e38-ae60-42189b25da6e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.428958355 +0000 UTC m=+22.274727263 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret") pod "global-pull-secret-syncer-wvd8j" (UID: "c3e33b82-1d36-4e38-ae60-42189b25da6e") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:45.529383 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:45.529342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:45.529525 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.529504 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:45.529525 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.529523 2579 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:45.529605 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.529533 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:45.529605 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.529585 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.529571081 +0000 UTC m=+34.375339971 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:45.672822 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:45.672784 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:45.672822 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:45.672802 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:13:45.672822 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:45.672784 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:45.673092 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.672919 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e" Apr 16 22:13:45.673092 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.672995 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181" Apr 16 22:13:45.673092 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:45.673042 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c" Apr 16 22:13:46.954653 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:13:46.954621 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7417e8e_90d4_47a8_926c_b10f15f3a850.slice/crio-05e16913b5605d73d90bd21a0ce6e9e1ed35f2f122037659ab0e62b326f9b8d0 WatchSource:0}: Error finding container 05e16913b5605d73d90bd21a0ce6e9e1ed35f2f122037659ab0e62b326f9b8d0: Status 404 returned error can't find the container with id 05e16913b5605d73d90bd21a0ce6e9e1ed35f2f122037659ab0e62b326f9b8d0 Apr 16 22:13:47.673200 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.672813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:47.673299 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.672862 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:47.673299 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.672908 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:13:47.673415 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:47.673316 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181" Apr 16 22:13:47.673466 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:47.673436 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e" Apr 16 22:13:47.673544 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:47.673523 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c" Apr 16 22:13:47.751692 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.751660 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" event={"ID":"74b3275a-d05c-43f4-a4f9-59f1e5fffbed","Type":"ContainerStarted","Data":"efe8f2e6094f1f360edc3459da6af5214b2ca5351ce0a07654efa317c4bc1bec"} Apr 16 22:13:47.753186 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.753127 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" event={"ID":"dae5bcc7-87a6-403b-ab23-dc1fc36c0615","Type":"ContainerStarted","Data":"5f685a8f8d206d6211e436859fdaf152c03c44ad4e625fc895fa08e7363c560c"} Apr 16 22:13:47.754568 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.754541 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-msw8q" 
event={"ID":"e7417e8e-90d4-47a8-926c-b10f15f3a850","Type":"ContainerStarted","Data":"b94793d3f3c3986479017e72f30e45f1bad009949129059937c0d550479a5142"} Apr 16 22:13:47.754678 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.754574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-msw8q" event={"ID":"e7417e8e-90d4-47a8-926c-b10f15f3a850","Type":"ContainerStarted","Data":"05e16913b5605d73d90bd21a0ce6e9e1ed35f2f122037659ab0e62b326f9b8d0"} Apr 16 22:13:47.756878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.756859 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:13:47.757192 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.757169 2579 generic.go:358] "Generic (PLEG): container finished" podID="3894fa51-bc91-4390-ab13-ef051552e33a" containerID="a2659546568bfbba27456fcddb9ecb38503ffd21e8faa9fa8e1a6b8580b61014" exitCode=1 Apr 16 22:13:47.757297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.757196 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"f084a8e74e373b93c0f29e78cad6daf28fa94c8cb3e937929a17a5eabcc50d6c"} Apr 16 22:13:47.757297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.757235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"8636ab71d85f38d8dd09419d3ca9f9174ff636ac10d268a7b6267e5efa48140e"} Apr 16 22:13:47.757297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.757250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" 
event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"6b4848f388cb68eacc21c9f1da7ddd506ade30f88656236ec87520c353ecfbf1"} Apr 16 22:13:47.757297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.757262 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerDied","Data":"a2659546568bfbba27456fcddb9ecb38503ffd21e8faa9fa8e1a6b8580b61014"} Apr 16 22:13:47.757297 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.757278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"df2cbb7d2beac4a1d0bba76b346a1bea274cb78a574abeb1ac77cb6a847b39fe"} Apr 16 22:13:47.758596 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.758565 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flbzq" event={"ID":"598903e0-6d7b-4392-b685-da66c0408923","Type":"ContainerStarted","Data":"12d61ce6643fad6f051c62a40ef32ae0e5b1f383ef8d5d1c5b4b99a04c631293"} Apr 16 22:13:47.760023 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.759986 2579 generic.go:358] "Generic (PLEG): container finished" podID="bcee1191-76cc-4c41-ad57-b41d75589f20" containerID="411bd15d655ee345bc6930730d16e6f5913a42889a69e40c81ed9ef54d192cdc" exitCode=0 Apr 16 22:13:47.760108 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.760056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerDied","Data":"411bd15d655ee345bc6930730d16e6f5913a42889a69e40c81ed9ef54d192cdc"} Apr 16 22:13:47.761465 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.761433 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fjlpn" 
event={"ID":"57d6ffd8-4f1e-4169-adf4-276bac26da26","Type":"ContainerStarted","Data":"b6def1d7b5dfc13ca3d98b91534162e8da0878cce6751b6dc2f48dc54d4c083e"} Apr 16 22:13:47.762749 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.762714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5c5ns" event={"ID":"e3a1ce43-bbf9-45df-abbd-7ec6821f991b","Type":"ContainerStarted","Data":"8a76fc9b0c0af47066aa7f5fd4275b1be28ef311af4d7258edc5511cd3491ef0"} Apr 16 22:13:47.772464 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.772419 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8x5qm" podStartSLOduration=3.077199467 podStartE2EDuration="19.772404326s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.300445436 +0000 UTC m=+3.146214329" lastFinishedPulling="2026-04-16 22:13:46.995650282 +0000 UTC m=+19.841419188" observedRunningTime="2026-04-16 22:13:47.771873627 +0000 UTC m=+20.617642538" watchObservedRunningTime="2026-04-16 22:13:47.772404326 +0000 UTC m=+20.618173233" Apr 16 22:13:47.787387 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.787342 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-msw8q" podStartSLOduration=11.787325706 podStartE2EDuration="11.787325706s" podCreationTimestamp="2026-04-16 22:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:47.787063472 +0000 UTC m=+20.632832383" watchObservedRunningTime="2026-04-16 22:13:47.787325706 +0000 UTC m=+20.633094631" Apr 16 22:13:47.821563 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.821518 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fjlpn" podStartSLOduration=8.704616502 podStartE2EDuration="20.821506088s" 
podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.301910951 +0000 UTC m=+3.147679848" lastFinishedPulling="2026-04-16 22:13:42.41880053 +0000 UTC m=+15.264569434" observedRunningTime="2026-04-16 22:13:47.821216546 +0000 UTC m=+20.666985460" watchObservedRunningTime="2026-04-16 22:13:47.821506088 +0000 UTC m=+20.667274999" Apr 16 22:13:47.834081 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:47.834048 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-flbzq" podStartSLOduration=4.440625183 podStartE2EDuration="20.834037874s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.326853108 +0000 UTC m=+3.172622006" lastFinishedPulling="2026-04-16 22:13:46.720265794 +0000 UTC m=+19.566034697" observedRunningTime="2026-04-16 22:13:47.83365346 +0000 UTC m=+20.679422371" watchObservedRunningTime="2026-04-16 22:13:47.834037874 +0000 UTC m=+20.679806786" Apr 16 22:13:48.273827 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.273801 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:13:48.657317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.657224 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:13:48.273823339Z","UUID":"0a706cec-f19b-4d46-b9a1-ae4f2e89401a","Handler":null,"Name":"","Endpoint":""} Apr 16 22:13:48.660607 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.660585 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:13:48.660607 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.660612 2579 csi_plugin.go:119] kubernetes.io/csi: 
Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:13:48.767788 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.767760 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:13:48.768227 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.768186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"33d1d333617d4821a426dc5240d2615011405546f599d1ea0f814debf42d394f"} Apr 16 22:13:48.770042 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.770014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" event={"ID":"74b3275a-d05c-43f4-a4f9-59f1e5fffbed","Type":"ContainerStarted","Data":"6843b0b88a13747b6bdee68537499a73e823e93f9c4e717affa9e3a24bd69a5b"} Apr 16 22:13:48.771618 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.771537 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8r8rq" event={"ID":"c65407a6-d8b9-47f5-ac3f-231ddd09de73","Type":"ContainerStarted","Data":"b6a182f880bc57c0c0a093c36003abb3177df31c96afa35d5a2cceda9bdc78b5"} Apr 16 22:13:48.785276 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.785230 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5c5ns" podStartSLOduration=4.839635673 podStartE2EDuration="21.785214145s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.29911457 +0000 UTC m=+3.144883465" lastFinishedPulling="2026-04-16 22:13:47.244693033 +0000 UTC m=+20.090461937" observedRunningTime="2026-04-16 22:13:47.849564411 +0000 UTC m=+20.695333335" watchObservedRunningTime="2026-04-16 22:13:48.785214145 +0000 
UTC m=+21.630983058" Apr 16 22:13:48.785505 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:48.785476 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8r8rq" podStartSLOduration=5.361478475 podStartE2EDuration="21.785468598s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.296256308 +0000 UTC m=+3.142025197" lastFinishedPulling="2026-04-16 22:13:46.720246426 +0000 UTC m=+19.566015320" observedRunningTime="2026-04-16 22:13:48.785087791 +0000 UTC m=+21.630856703" watchObservedRunningTime="2026-04-16 22:13:48.785468598 +0000 UTC m=+21.631237514" Apr 16 22:13:49.461004 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:49.460928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:13:49.461542 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:49.461034 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:49.461542 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:49.461111 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret podName:c3e33b82-1d36-4e38-ae60-42189b25da6e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:57.461092017 +0000 UTC m=+30.306860922 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret") pod "global-pull-secret-syncer-wvd8j" (UID: "c3e33b82-1d36-4e38-ae60-42189b25da6e") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:49.672249 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:49.672174 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:49.672249 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:49.672186 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:13:49.672475 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:49.672312 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181" Apr 16 22:13:49.672475 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:49.672378 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e" Apr 16 22:13:49.672475 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:49.672402 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:13:49.672611 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:49.672490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c" Apr 16 22:13:49.778188 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:49.778154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" event={"ID":"74b3275a-d05c-43f4-a4f9-59f1e5fffbed","Type":"ContainerStarted","Data":"92b8c88ef5218cd9e6ad8e019f2b8da6dd3fd165fab02c9a67ee06f1f00bdd85"} Apr 16 22:13:49.811249 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:49.811203 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smnf8" podStartSLOduration=3.907431218 podStartE2EDuration="22.811187899s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.326873015 +0000 UTC m=+3.172641907" lastFinishedPulling="2026-04-16 22:13:49.230629685 +0000 UTC m=+22.076398588" observedRunningTime="2026-04-16 22:13:49.811021447 +0000 UTC m=+22.656790359" watchObservedRunningTime="2026-04-16 22:13:49.811187899 +0000 UTC m=+22.656956812" Apr 16 22:13:50.783812 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:50.783783 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:13:50.784348 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:50.784251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"c7c61478c7d8364661e66f12345cf2f70e11074830f5e5b5ffe751ee9d9ee868"} Apr 16 22:13:51.672448 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:51.672224 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:13:51.672606 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:51.672224 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:13:51.672606 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:51.672534 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e" Apr 16 22:13:51.672715 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:51.672638 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181" Apr 16 22:13:51.672715 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:51.672234 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:51.672835 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:51.672750 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:52.541607 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.541582 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fjlpn"
Apr 16 22:13:52.542095 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.542079 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fjlpn"
Apr 16 22:13:52.790393 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.790369 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log"
Apr 16 22:13:52.790695 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.790674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"2f1e95666648be843ff0175863f256a9c8b62a594bce2f258daa4cb051b08e24"}
Apr 16 22:13:52.790983 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.790967 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:52.791067 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.790992 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:52.791168 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.791150 2579 scope.go:117] "RemoveContainer" containerID="a2659546568bfbba27456fcddb9ecb38503ffd21e8faa9fa8e1a6b8580b61014"
Apr 16 22:13:52.792502 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.792476 2579 generic.go:358] "Generic (PLEG): container finished" podID="bcee1191-76cc-4c41-ad57-b41d75589f20" containerID="4b8c4be1731148f8065cfe62b809fee642e5bb85ab29bf97202443711f78880a" exitCode=0
Apr 16 22:13:52.792599 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.792557 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerDied","Data":"4b8c4be1731148f8065cfe62b809fee642e5bb85ab29bf97202443711f78880a"}
Apr 16 22:13:52.806689 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:52.806667 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:53.672547 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.672390 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:53.672972 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.672417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:53.672972 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:53.672617 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:53.672972 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.672422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:53.672972 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:53.672749 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:53.672972 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:53.672849 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:53.797607 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.797581 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log"
Apr 16 22:13:53.797952 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.797918 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" event={"ID":"3894fa51-bc91-4390-ab13-ef051552e33a","Type":"ContainerStarted","Data":"8ece45f140ef58c173f3b4cc0b84f9e43bcda1ba3fa0026121534b1c28afac5d"}
Apr 16 22:13:53.798139 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.798119 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:53.799749 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.799712 2579 generic.go:358] "Generic (PLEG): container finished" podID="bcee1191-76cc-4c41-ad57-b41d75589f20" containerID="0a4b5102eeeefac85ccfd47981900a14f34f3cac4327bc8523d2a5ff3d8acaba" exitCode=0
Apr 16 22:13:53.799848 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.799776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerDied","Data":"0a4b5102eeeefac85ccfd47981900a14f34f3cac4327bc8523d2a5ff3d8acaba"}
Apr 16 22:13:53.811947 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.811930 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz"
Apr 16 22:13:53.830285 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.830247 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" podStartSLOduration=10.117095545 podStartE2EDuration="26.830235565s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.327012575 +0000 UTC m=+3.172781466" lastFinishedPulling="2026-04-16 22:13:47.040152596 +0000 UTC m=+19.885921486" observedRunningTime="2026-04-16 22:13:53.829609335 +0000 UTC m=+26.675378247" watchObservedRunningTime="2026-04-16 22:13:53.830235565 +0000 UTC m=+26.676004477"
Apr 16 22:13:53.984605 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.984573 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvd8j"]
Apr 16 22:13:53.984737 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.984665 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:53.984776 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:53.984750 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:53.987876 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.987850 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qgfjd"]
Apr 16 22:13:53.987987 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.987962 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:53.988090 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:53.988071 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:53.988384 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.988356 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6978g"]
Apr 16 22:13:53.988483 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:53.988435 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:53.988529 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:53.988504 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:54.742088 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:54.742066 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fjlpn"
Apr 16 22:13:54.742583 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:54.742175 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:13:54.742710 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:54.742692 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fjlpn"
Apr 16 22:13:54.803128 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:54.803101 2579 generic.go:358] "Generic (PLEG): container finished" podID="bcee1191-76cc-4c41-ad57-b41d75589f20" containerID="b5a6158f3b658558e8ac7ff410cd5a861c43bb33e08c04e3214138bb1a2240e9" exitCode=0
Apr 16 22:13:54.803235 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:54.803128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerDied","Data":"b5a6158f3b658558e8ac7ff410cd5a861c43bb33e08c04e3214138bb1a2240e9"}
Apr 16 22:13:55.671940 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:55.671908 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:55.671940 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:55.671935 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:55.672158 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:55.671935 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:55.672158 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:55.672015 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:55.672158 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:55.672131 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:55.672272 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:55.672219 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:57.526352 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:57.526321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:57.526836 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:57.526489 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:57.526836 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:57.526566 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret podName:c3e33b82-1d36-4e38-ae60-42189b25da6e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.526545441 +0000 UTC m=+46.372314349 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret") pod "global-pull-secret-syncer-wvd8j" (UID: "c3e33b82-1d36-4e38-ae60-42189b25da6e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:57.673301 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:57.673264 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:57.673450 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:57.673366 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:57.673450 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:57.673417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:57.673570 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:57.673522 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:57.673622 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:57.673568 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:57.673669 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:57.673641 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:59.672363 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:59.672277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g"
Apr 16 22:13:59.672783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:59.672277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:13:59.672783 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:59.672395 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6978g" podUID="e531ad1d-2d55-48e3-afc2-f5404821539c"
Apr 16 22:13:59.672783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:59.672422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j"
Apr 16 22:13:59.672783 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:59.672500 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:13:59.673334 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:13:59.672564 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvd8j" podUID="c3e33b82-1d36-4e38-ae60-42189b25da6e"
Apr 16 22:13:59.938596 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:59.938336 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-39.ec2.internal" event="NodeReady"
Apr 16 22:13:59.938746 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:59.938674 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:13:59.978415 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:13:59.978393 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66f4955778-rrcqf"]
Apr 16 22:14:00.008361 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.008336 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rpbsg"]
Apr 16 22:14:00.008528 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.008507 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.011132 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.011103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:14:00.011294 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.011102 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:14:00.011294 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.011188 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2kcsr\""
Apr 16 22:14:00.011525 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.011505 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:14:00.017011 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.016834 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:14:00.023800 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.023778 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sfffv"]
Apr 16 22:14:00.023956 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.023942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.026739 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.026706 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:00.026834 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.026707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:00.026834 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.026810 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6fpb6\""
Apr 16 22:14:00.047830 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.047802 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66f4955778-rrcqf"]
Apr 16 22:14:00.047942 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.047839 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rpbsg"]
Apr 16 22:14:00.047942 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.047854 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sfffv"]
Apr 16 22:14:00.048044 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.047966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:14:00.050559 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.050534 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:00.050644 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.050570 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:00.050695 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.050654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bd46\""
Apr 16 22:14:00.050777 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.050763 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:00.144540 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1e036c3-8dac-4562-9015-62e7e9f36238-ca-trust-extracted\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144540 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144540 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-certificates\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wps\" (UniqueName: \"kubernetes.io/projected/ed1be888-8420-4861-992a-ffd27fc02a14-kube-api-access-l8wps\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:14:00.144700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-tmp-dir\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.144700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144664 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-trusted-ca\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-installation-pull-secrets\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-bound-sa-token\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-image-registry-private-configuration\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144848 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpt7g\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-kube-api-access-zpt7g\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.144918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-config-volume\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.144918 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.145109 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:14:00.145109 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.144955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwxwv\" (UniqueName: \"kubernetes.io/projected/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-kube-api-access-wwxwv\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.245436 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1e036c3-8dac-4562-9015-62e7e9f36238-ca-trust-extracted\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245529 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-certificates\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245567 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245552 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wps\" (UniqueName: \"kubernetes.io/projected/ed1be888-8420-4861-992a-ffd27fc02a14-kube-api-access-l8wps\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:14:00.245629 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-tmp-dir\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.245676 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245663 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-trusted-ca\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245717 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-installation-pull-secrets\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245718 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-bound-sa-token\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245765 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-image-registry-private-configuration\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245824 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpt7g\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-kube-api-access-zpt7g\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1e036c3-8dac-4562-9015-62e7e9f36238-ca-trust-extracted\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.245878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245854 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-config-volume\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.245878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.246083 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:14:00.246083 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.245928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwxwv\" (UniqueName: \"kubernetes.io/projected/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-kube-api-access-wwxwv\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.246083 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.246038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-tmp-dir\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:14:00.246083 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.246049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-certificates\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.246240 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246149 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:14:00.246240 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246166 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found
Apr 16 22:14:00.246240 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246215 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:00.246240 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246227 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:00.746207757 +0000 UTC m=+33.591976710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found
Apr 16 22:14:00.246410 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246264 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:00.746247226 +0000 UTC m=+33.592016135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:00.246410 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246316 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:00.246410 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.246360 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:00.746345953 +0000 UTC m=+33.592114866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found
Apr 16 22:14:00.246506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.246485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-trusted-ca\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:14:00.246506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.246489 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-config-volume\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr
16 22:14:00.250742 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.250714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-installation-pull-secrets\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:00.250817 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.250776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-image-registry-private-configuration\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:00.256664 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.256646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wps\" (UniqueName: \"kubernetes.io/projected/ed1be888-8420-4861-992a-ffd27fc02a14-kube-api-access-l8wps\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:00.257108 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.257094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwxwv\" (UniqueName: \"kubernetes.io/projected/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-kube-api-access-wwxwv\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:00.264672 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.264653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpt7g\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-kube-api-access-zpt7g\") pod 
\"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:00.264752 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.264699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-bound-sa-token\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:00.748974 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.748926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.749011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.749050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749080 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:00.749386 
ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749103 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749138 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749165 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.749144899 +0000 UTC m=+34.594913808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749181 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.749175266 +0000 UTC m=+34.594944162 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749251 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:00.749386 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:00.749313 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.749297267 +0000 UTC m=+34.595066175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:14:00.816150 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:00.816120 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerStarted","Data":"56dbe940005336c596c22ae298b1aee381ee30c8259769d77ff0ec02387536f5"} Apr 16 22:14:01.453429 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.453393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:14:01.453635 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.453537 2579 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:01.453635 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.453598 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.453584692 +0000 UTC m=+66.299353586 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:01.554013 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.553982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:14:01.554162 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.554144 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:14:01.554201 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.554166 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:14:01.554201 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.554176 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9kfzr for pod openshift-network-diagnostics/network-check-target-6978g: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:01.554264 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.554222 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr podName:e531ad1d-2d55-48e3-afc2-f5404821539c nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.554209784 +0000 UTC m=+66.399978675 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9kfzr" (UniqueName: "kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr") pod "network-check-target-6978g" (UID: "e531ad1d-2d55-48e3-afc2-f5404821539c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:01.672534 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.672506 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:14:01.672659 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.672608 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:14:01.672659 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.672633 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:14:01.675361 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.675341 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:14:01.675486 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.675341 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:01.675486 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.675341 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:01.676607 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.676588 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gj5qm\"" Apr 16 22:14:01.676715 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.676589 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lt98\"" Apr 16 22:14:01.676715 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.676654 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:01.756211 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.756188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756327 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:01.756557 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:14:01.756356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756381 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:03.756367215 +0000 UTC m=+36.602136106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.756414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756429 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756439 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756476 2579 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756495 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:03.756474994 +0000 UTC m=+36.602243883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:14:01.756557 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:01.756507 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:03.756501634 +0000 UTC m=+36.602270527 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:14:01.820037 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.820006 2579 generic.go:358] "Generic (PLEG): container finished" podID="bcee1191-76cc-4c41-ad57-b41d75589f20" containerID="56dbe940005336c596c22ae298b1aee381ee30c8259769d77ff0ec02387536f5" exitCode=0 Apr 16 22:14:01.820242 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:01.820050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerDied","Data":"56dbe940005336c596c22ae298b1aee381ee30c8259769d77ff0ec02387536f5"} Apr 16 22:14:02.824282 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:02.824252 2579 generic.go:358] "Generic (PLEG): container finished" podID="bcee1191-76cc-4c41-ad57-b41d75589f20" containerID="03c92498d2446d207587a99103607af759f5ff8e1c0e1895a7912f1ad465b797" exitCode=0 Apr 16 22:14:02.824766 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:02.824289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerDied","Data":"03c92498d2446d207587a99103607af759f5ff8e1c0e1895a7912f1ad465b797"} Apr 16 22:14:03.772307 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:03.772274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:03.772307 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:03.772317 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:03.772489 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:03.772338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:03.772489 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772417 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:03.772489 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772430 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:03.772489 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772460 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.772448012 +0000 UTC m=+40.618216902 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:14:03.772489 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772481 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:07.772468648 +0000 UTC m=+40.618237538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:14:03.772646 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772509 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:03.772646 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772530 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:14:03.772646 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:03.772573 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.77256022 +0000 UTC m=+40.618329110 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:14:03.830803 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:03.830719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knsc9" event={"ID":"bcee1191-76cc-4c41-ad57-b41d75589f20","Type":"ContainerStarted","Data":"fff4904c879fb64525164c3e1c16d6135a9425495da8c029dafc4fcfc7bf7efe"} Apr 16 22:14:03.866066 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:03.866018 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-knsc9" podStartSLOduration=6.644304348 podStartE2EDuration="36.866004797s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.302699093 +0000 UTC m=+3.148467996" lastFinishedPulling="2026-04-16 22:14:00.524399556 +0000 UTC m=+33.370168445" observedRunningTime="2026-04-16 22:14:03.864299382 +0000 UTC m=+36.710068293" watchObservedRunningTime="2026-04-16 22:14:03.866004797 +0000 UTC m=+36.711773737" Apr 16 22:14:07.801536 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:07.801359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:07.801547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: 
\"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:07.801568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801509 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801616 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801657 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801678 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801687 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.801667409 +0000 UTC m=+48.647436321 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801712 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.801702111 +0000 UTC m=+48.647471017 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:14:07.801938 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:07.801757 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.801746071 +0000 UTC m=+48.647514979 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:14:13.541444 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:13.541408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:14:13.544502 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:13.544484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3e33b82-1d36-4e38-ae60-42189b25da6e-original-pull-secret\") pod \"global-pull-secret-syncer-wvd8j\" (UID: \"c3e33b82-1d36-4e38-ae60-42189b25da6e\") " pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:14:13.692455 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:13.692431 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvd8j" Apr 16 22:14:13.870157 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:13.870129 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvd8j"] Apr 16 22:14:13.873412 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:14:13.873387 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e33b82_1d36_4e38_ae60_42189b25da6e.slice/crio-15d048a97c5ab8ee63285135793e10b61eacc2f517d189fa1df6c05d4e19083e WatchSource:0}: Error finding container 15d048a97c5ab8ee63285135793e10b61eacc2f517d189fa1df6c05d4e19083e: Status 404 returned error can't find the container with id 15d048a97c5ab8ee63285135793e10b61eacc2f517d189fa1df6c05d4e19083e Apr 16 22:14:14.849913 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:14.849874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvd8j" event={"ID":"c3e33b82-1d36-4e38-ae60-42189b25da6e","Type":"ContainerStarted","Data":"15d048a97c5ab8ee63285135793e10b61eacc2f517d189fa1df6c05d4e19083e"} Apr 16 22:14:15.858061 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:15.858021 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:15.858071 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:15.858485 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:14:15.858101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858187 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858209 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858191 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858275 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:31.858254096 +0000 UTC m=+64.704022992 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858271 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858358 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:31.858340243 +0000 UTC m=+64.704109137 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:14:15.858485 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:15.858412 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:31.858397277 +0000 UTC m=+64.704166182 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:14:18.858971 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:18.858936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvd8j" event={"ID":"c3e33b82-1d36-4e38-ae60-42189b25da6e","Type":"ContainerStarted","Data":"4e2b1b3d1c6dbedc2f94e1e534b53509023a91f71b7d9bd58c9a357d8e548873"} Apr 16 22:14:18.875593 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:18.875549 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wvd8j" podStartSLOduration=33.997578373 podStartE2EDuration="37.875536354s" podCreationTimestamp="2026-04-16 22:13:41 +0000 UTC" firstStartedPulling="2026-04-16 22:14:13.881650735 +0000 UTC m=+46.727419625" lastFinishedPulling="2026-04-16 22:14:17.759608714 +0000 UTC m=+50.605377606" observedRunningTime="2026-04-16 22:14:18.874960827 +0000 UTC m=+51.720729753" watchObservedRunningTime="2026-04-16 22:14:18.875536354 +0000 UTC m=+51.721305266" Apr 16 22:14:25.814952 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:25.814928 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4npz" Apr 16 22:14:31.867604 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:31.867571 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:14:31.867604 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:31.867609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:31.867655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867706 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867750 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867762 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867783 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867788 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:03.867773614 +0000 UTC m=+96.713542509 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867851 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:03.867838214 +0000 UTC m=+96.713607108 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:14:31.868107 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:31.867862 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:03.867856285 +0000 UTC m=+96.713625178 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:14:33.478451 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.478410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:14:33.481097 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.481080 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:33.489136 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:33.489117 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:33.489192 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:14:33.489173 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:37.489158488 +0000 UTC m=+130.334927378 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : secret "metrics-daemon-secret" not found Apr 16 22:14:33.579323 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.579286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:14:33.581935 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.581919 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:33.592489 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.592471 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:33.603037 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.603016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kfzr\" (UniqueName: \"kubernetes.io/projected/e531ad1d-2d55-48e3-afc2-f5404821539c-kube-api-access-9kfzr\") pod \"network-check-target-6978g\" (UID: \"e531ad1d-2d55-48e3-afc2-f5404821539c\") " pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:14:33.785209 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.785138 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lt98\"" Apr 16 22:14:33.793262 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.793244 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:14:33.901029 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:33.901000 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6978g"] Apr 16 22:14:33.905076 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:14:33.905047 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode531ad1d_2d55_48e3_afc2_f5404821539c.slice/crio-7f642b9221d41eadd37dbe120d2bde29584a6e414962b64281fa489f4844f7d4 WatchSource:0}: Error finding container 7f642b9221d41eadd37dbe120d2bde29584a6e414962b64281fa489f4844f7d4: Status 404 returned error can't find the container with id 7f642b9221d41eadd37dbe120d2bde29584a6e414962b64281fa489f4844f7d4 Apr 16 22:14:34.891275 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:34.891230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6978g" event={"ID":"e531ad1d-2d55-48e3-afc2-f5404821539c","Type":"ContainerStarted","Data":"7f642b9221d41eadd37dbe120d2bde29584a6e414962b64281fa489f4844f7d4"} Apr 16 22:14:36.896428 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:36.896347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6978g" event={"ID":"e531ad1d-2d55-48e3-afc2-f5404821539c","Type":"ContainerStarted","Data":"0abc79a9c23b4430a15f92e8b35bae3462654ac868a4815d18f000ea4251bb99"} Apr 16 22:14:36.896788 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:14:36.896496 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:15:03.889764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:03.889653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:15:03.889764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:03.889705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg" Apr 16 22:15:03.889764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:03.889752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv" Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.889824 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.889846 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.889892 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.889921 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:16:07.889903552 +0000 UTC m=+160.735672461 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.889956 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.889939574 +0000 UTC m=+160.735708482 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.889894 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:03.890267 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:03.890000 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.889990699 +0000 UTC m=+160.735759589 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found Apr 16 22:15:07.900358 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:07.900331 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6978g" Apr 16 22:15:07.918932 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:07.918889 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6978g" podStartSLOduration=97.336173657 podStartE2EDuration="1m39.918868344s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:14:33.906875684 +0000 UTC m=+66.752644575" lastFinishedPulling="2026-04-16 22:14:36.489570373 +0000 UTC m=+69.335339262" observedRunningTime="2026-04-16 22:14:36.911207586 +0000 UTC m=+69.756976495" watchObservedRunningTime="2026-04-16 22:15:07.918868344 +0000 UTC m=+100.764637259" Apr 16 22:15:37.517830 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:37.517793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:15:37.518304 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:37.517912 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:15:37.518304 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:37.517966 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs 
podName:a2d7d39e-d19f-4a6e-8107-593903f29181 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:39.517951654 +0000 UTC m=+252.363720544 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs") pod "network-metrics-daemon-qgfjd" (UID: "a2d7d39e-d19f-4a6e-8107-593903f29181") : secret "metrics-daemon-secret" not found Apr 16 22:15:55.523834 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.523798 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92"] Apr 16 22:15:55.526796 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.526775 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92" Apr 16 22:15:55.529510 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.529488 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:55.529510 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.529504 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mfv5k\"" Apr 16 22:15:55.530596 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.530581 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:55.533076 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.532944 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-87gkq"] Apr 16 22:15:55.533434 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.533407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pf7vx\" (UniqueName: \"kubernetes.io/projected/1326ec62-5db0-4705-a851-056172e81fd1-kube-api-access-pf7vx\") pod \"volume-data-source-validator-7c6cbb6c87-rwd92\" (UID: \"1326ec62-5db0-4705-a851-056172e81fd1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92" Apr 16 22:15:55.535595 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.535582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-87gkq" Apr 16 22:15:55.537600 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.537582 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92"] Apr 16 22:15:55.539612 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.539592 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 22:15:55.540026 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.540007 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:15:55.540119 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.540103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-nwhms\"" Apr 16 22:15:55.540119 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.540106 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 22:15:55.540241 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.540149 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:15:55.546222 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.546185 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-87gkq"] Apr 16 22:15:55.548862 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.548838 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 22:15:55.624736 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.624706 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"] Apr 16 22:15:55.627446 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.627432 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" Apr 16 22:15:55.629961 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.629939 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-65fn8\"" Apr 16 22:15:55.630075 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.629967 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:55.630075 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.629987 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 22:15:55.630181 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.630147 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 22:15:55.630181 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.630157 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:55.631541 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.631520 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"] Apr 16 22:15:55.633808 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.633639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d72e208-6678-427c-826b-098451ce245c-config\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" Apr 16 22:15:55.633808 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.633673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7db6d6a8-5304-4b41-87c9-a4f433031f6e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq" Apr 16 22:15:55.633808 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.633697 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7db6d6a8-5304-4b41-87c9-a4f433031f6e-service-ca-bundle\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq" Apr 16 22:15:55.633974 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.633820 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d72e208-6678-427c-826b-098451ce245c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" Apr 16 22:15:55.633974 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.633853 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7db6d6a8-5304-4b41-87c9-a4f433031f6e-snapshots\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq" Apr 16 22:15:55.634046 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.634024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf7vx\" (UniqueName: \"kubernetes.io/projected/1326ec62-5db0-4705-a851-056172e81fd1-kube-api-access-pf7vx\") pod \"volume-data-source-validator-7c6cbb6c87-rwd92\" (UID: \"1326ec62-5db0-4705-a851-056172e81fd1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92" Apr 16 22:15:55.634105 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.634088 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7db6d6a8-5304-4b41-87c9-a4f433031f6e-tmp\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq" Apr 16 22:15:55.634142 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.634123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gtd\" (UniqueName: \"kubernetes.io/projected/7db6d6a8-5304-4b41-87c9-a4f433031f6e-kube-api-access-c6gtd\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq" Apr 16 22:15:55.634191 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.634164 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db6d6a8-5304-4b41-87c9-a4f433031f6e-serving-cert\") pod 
\"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.634280 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.634261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72rfh\" (UniqueName: \"kubernetes.io/projected/3d72e208-6678-427c-826b-098451ce245c-kube-api-access-72rfh\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.634946 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.634928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"]
Apr 16 22:15:55.635075 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.635065 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.637187 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.637167 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sxgmc\""
Apr 16 22:15:55.637456 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.637429 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 22:15:55.637533 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.637439 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 22:15:55.637660 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.637628 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-s2fv5"]
Apr 16 22:15:55.637774 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.637749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:55.637844 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.637792 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 22:15:55.639363 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.639346 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 22:15:55.640397 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.640380 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"]
Apr 16 22:15:55.640492 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.640478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.640562 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.640545 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-76zdg\""
Apr 16 22:15:55.640662 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.640478 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:55.640847 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.640481 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 22:15:55.641030 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.641005 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 22:15:55.642818 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.642796 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jtvx4\""
Apr 16 22:15:55.643256 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.643181 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 22:15:55.643577 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.643560 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 22:15:55.643665 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.643596 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:55.643665 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.643646 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 22:15:55.653801 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.653719 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"]
Apr 16 22:15:55.654049 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.654033 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"]
Apr 16 22:15:55.654869 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.654851 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-s2fv5"]
Apr 16 22:15:55.655192 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.655172 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 22:15:55.666995 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.666976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf7vx\" (UniqueName: \"kubernetes.io/projected/1326ec62-5db0-4705-a851-056172e81fd1-kube-api-access-pf7vx\") pod \"volume-data-source-validator-7c6cbb6c87-rwd92\" (UID: \"1326ec62-5db0-4705-a851-056172e81fd1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92"
Apr 16 22:15:55.735215 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72rfh\" (UniqueName: \"kubernetes.io/projected/3d72e208-6678-427c-826b-098451ce245c-kube-api-access-72rfh\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.735330 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d72e208-6678-427c-826b-098451ce245c-config\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.735330 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7db6d6a8-5304-4b41-87c9-a4f433031f6e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.735330 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7db6d6a8-5304-4b41-87c9-a4f433031f6e-service-ca-bundle\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.735471 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735334 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d72e208-6678-427c-826b-098451ce245c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.735471 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gtd\" (UniqueName: \"kubernetes.io/projected/7db6d6a8-5304-4b41-87c9-a4f433031f6e-kube-api-access-c6gtd\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.735471 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqjmx\" (UniqueName: \"kubernetes.io/projected/e3b13759-2e6b-457f-a585-d1648b1b4543-kube-api-access-kqjmx\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:55.735619 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a242c550-213a-4a82-8bbb-01a37bbc13c5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.735673 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7db6d6a8-5304-4b41-87c9-a4f433031f6e-tmp\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.735755 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735693 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.735816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d72e208-6678-427c-826b-098451ce245c-config\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.735816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735773 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7db6d6a8-5304-4b41-87c9-a4f433031f6e-snapshots\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.735816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c17f94-3e64-4703-a384-9593c55d048f-serving-cert\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.735968 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:55.735968 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnhv\" (UniqueName: \"kubernetes.io/projected/42c17f94-3e64-4703-a384-9593c55d048f-kube-api-access-kdnhv\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.735968 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96kb5\" (UniqueName: \"kubernetes.io/projected/a242c550-213a-4a82-8bbb-01a37bbc13c5-kube-api-access-96kb5\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.735968 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735962 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c17f94-3e64-4703-a384-9593c55d048f-config\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.736159 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.735980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7db6d6a8-5304-4b41-87c9-a4f433031f6e-tmp\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.736159 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.736032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db6d6a8-5304-4b41-87c9-a4f433031f6e-serving-cert\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.736159 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.736071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42c17f94-3e64-4703-a384-9593c55d048f-trusted-ca\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.736159 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.736136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7db6d6a8-5304-4b41-87c9-a4f433031f6e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.736406 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.736350 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7db6d6a8-5304-4b41-87c9-a4f433031f6e-snapshots\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.736541 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.736517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7db6d6a8-5304-4b41-87c9-a4f433031f6e-service-ca-bundle\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.737981 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.737960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d72e208-6678-427c-826b-098451ce245c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.738171 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.738152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db6d6a8-5304-4b41-87c9-a4f433031f6e-serving-cert\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.743816 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.743796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72rfh\" (UniqueName: \"kubernetes.io/projected/3d72e208-6678-427c-826b-098451ce245c-kube-api-access-72rfh\") pod \"service-ca-operator-d6fc45fc5-nbj62\" (UID: \"3d72e208-6678-427c-826b-098451ce245c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.744610 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.744592 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gtd\" (UniqueName: \"kubernetes.io/projected/7db6d6a8-5304-4b41-87c9-a4f433031f6e-kube-api-access-c6gtd\") pod \"insights-operator-585dfdc468-87gkq\" (UID: \"7db6d6a8-5304-4b41-87c9-a4f433031f6e\") " pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.835058 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.835004 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92"
Apr 16 22:15:55.836643 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqjmx\" (UniqueName: \"kubernetes.io/projected/e3b13759-2e6b-457f-a585-d1648b1b4543-kube-api-access-kqjmx\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:55.836764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a242c550-213a-4a82-8bbb-01a37bbc13c5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.836764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.836764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c17f94-3e64-4703-a384-9593c55d048f-serving-cert\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.836764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnhv\" (UniqueName: \"kubernetes.io/projected/42c17f94-3e64-4703-a384-9593c55d048f-kube-api-access-kdnhv\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96kb5\" (UniqueName: \"kubernetes.io/projected/a242c550-213a-4a82-8bbb-01a37bbc13c5-kube-api-access-96kb5\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:55.836805 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c17f94-3e64-4703-a384-9593c55d048f-config\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.836847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42c17f94-3e64-4703-a384-9593c55d048f-trusted-ca\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:55.836891 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls podName:a242c550-213a-4a82-8bbb-01a37bbc13c5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.336871253 +0000 UTC m=+149.182640162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5c4w2" (UID: "a242c550-213a-4a82-8bbb-01a37bbc13c5") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:55.836990 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:55.836952 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:15:55.837331 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:55.837052 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls podName:e3b13759-2e6b-457f-a585-d1648b1b4543 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.337032584 +0000 UTC m=+149.182801492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gscs" (UID: "e3b13759-2e6b-457f-a585-d1648b1b4543") : secret "samples-operator-tls" not found
Apr 16 22:15:55.837519 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.837499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c17f94-3e64-4703-a384-9593c55d048f-config\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.837909 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.837890 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42c17f94-3e64-4703-a384-9593c55d048f-trusted-ca\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.838239 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.838220 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a242c550-213a-4a82-8bbb-01a37bbc13c5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.838993 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.838977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c17f94-3e64-4703-a384-9593c55d048f-serving-cert\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.843948 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.843928 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-87gkq"
Apr 16 22:15:55.846760 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.846719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnhv\" (UniqueName: \"kubernetes.io/projected/42c17f94-3e64-4703-a384-9593c55d048f-kube-api-access-kdnhv\") pod \"console-operator-9d4b6777b-s2fv5\" (UID: \"42c17f94-3e64-4703-a384-9593c55d048f\") " pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.846916 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.846789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqjmx\" (UniqueName: \"kubernetes.io/projected/e3b13759-2e6b-457f-a585-d1648b1b4543-kube-api-access-kqjmx\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:55.846916 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.846858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96kb5\" (UniqueName: \"kubernetes.io/projected/a242c550-213a-4a82-8bbb-01a37bbc13c5-kube-api-access-96kb5\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:55.937165 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.937131 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"
Apr 16 22:15:55.952211 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.952184 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92"]
Apr 16 22:15:55.955488 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:15:55.955461 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1326ec62_5db0_4705_a851_056172e81fd1.slice/crio-da685ba322586c3b1ec90a26573e8d7b23c21419530cf8d1b57fc8e2e3c325e4 WatchSource:0}: Error finding container da685ba322586c3b1ec90a26573e8d7b23c21419530cf8d1b57fc8e2e3c325e4: Status 404 returned error can't find the container with id da685ba322586c3b1ec90a26573e8d7b23c21419530cf8d1b57fc8e2e3c325e4
Apr 16 22:15:55.960800 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.960779 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:15:55.970501 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:55.970480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-87gkq"]
Apr 16 22:15:55.973385 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:15:55.973354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db6d6a8_5304_4b41_87c9_a4f433031f6e.slice/crio-ed1a791e2f47870f28c727da4228a5422d3ef65f17005acab6eab2b84ebcf930 WatchSource:0}: Error finding container ed1a791e2f47870f28c727da4228a5422d3ef65f17005acab6eab2b84ebcf930: Status 404 returned error can't find the container with id ed1a791e2f47870f28c727da4228a5422d3ef65f17005acab6eab2b84ebcf930
Apr 16 22:15:56.051775 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:56.051715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-87gkq" event={"ID":"7db6d6a8-5304-4b41-87c9-a4f433031f6e","Type":"ContainerStarted","Data":"ed1a791e2f47870f28c727da4228a5422d3ef65f17005acab6eab2b84ebcf930"}
Apr 16 22:15:56.052758 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:56.052711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92" event={"ID":"1326ec62-5db0-4705-a851-056172e81fd1","Type":"ContainerStarted","Data":"da685ba322586c3b1ec90a26573e8d7b23c21419530cf8d1b57fc8e2e3c325e4"}
Apr 16 22:15:56.060334 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:56.060309 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62"]
Apr 16 22:15:56.063381 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:15:56.063361 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d72e208_6678_427c_826b_098451ce245c.slice/crio-0cf7fb2cd173f3db20ecb655bfc67c1cefd917eee21d5d1f1591a26ee356d821 WatchSource:0}: Error finding container 0cf7fb2cd173f3db20ecb655bfc67c1cefd917eee21d5d1f1591a26ee356d821: Status 404 returned error can't find the container with id 0cf7fb2cd173f3db20ecb655bfc67c1cefd917eee21d5d1f1591a26ee356d821
Apr 16 22:15:56.084524 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:56.084499 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-s2fv5"]
Apr 16 22:15:56.087809 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:15:56.087777 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c17f94_3e64_4703_a384_9593c55d048f.slice/crio-353585385a95ed26517845dc9e001b35ca1be285610ee5bc66bf1c0ab3230788 WatchSource:0}: Error finding container 353585385a95ed26517845dc9e001b35ca1be285610ee5bc66bf1c0ab3230788: Status 404 returned error can't find the container with id 353585385a95ed26517845dc9e001b35ca1be285610ee5bc66bf1c0ab3230788
Apr 16 22:15:56.339591 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:56.339513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:56.339591 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:56.339570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:56.339817 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:56.339702 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:15:56.339817 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:56.339705 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:56.339817 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:56.339789 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls podName:e3b13759-2e6b-457f-a585-d1648b1b4543 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:57.339769012 +0000 UTC m=+150.185537917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gscs" (UID: "e3b13759-2e6b-457f-a585-d1648b1b4543") : secret "samples-operator-tls" not found
Apr 16 22:15:56.339817 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:56.339807 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls podName:a242c550-213a-4a82-8bbb-01a37bbc13c5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:57.339798295 +0000 UTC m=+150.185567201 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5c4w2" (UID: "a242c550-213a-4a82-8bbb-01a37bbc13c5") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:57.056954 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:57.056875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" event={"ID":"42c17f94-3e64-4703-a384-9593c55d048f","Type":"ContainerStarted","Data":"353585385a95ed26517845dc9e001b35ca1be285610ee5bc66bf1c0ab3230788"}
Apr 16 22:15:57.059338 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:57.059284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" event={"ID":"3d72e208-6678-427c-826b-098451ce245c","Type":"ContainerStarted","Data":"0cf7fb2cd173f3db20ecb655bfc67c1cefd917eee21d5d1f1591a26ee356d821"}
Apr 16 22:15:57.349798 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:57.348994 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:15:57.349798 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:57.349072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:15:57.349798 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:57.349286 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:15:57.349798 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:57.349343 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls podName:e3b13759-2e6b-457f-a585-d1648b1b4543 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.349323819 +0000 UTC m=+152.195092726 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gscs" (UID: "e3b13759-2e6b-457f-a585-d1648b1b4543") : secret "samples-operator-tls" not found
Apr 16 22:15:57.349798 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:57.349709 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:57.349798 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:57.349775 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls podName:a242c550-213a-4a82-8bbb-01a37bbc13c5 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.349759891 +0000 UTC m=+152.195528851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5c4w2" (UID: "a242c550-213a-4a82-8bbb-01a37bbc13c5") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:58.062695 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:58.062654 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92" event={"ID":"1326ec62-5db0-4705-a851-056172e81fd1","Type":"ContainerStarted","Data":"9daaad5388f4c217c196cd7f2d175e65b842a0cba5283b32ff94a3b9e3c56138"}
Apr 16 22:15:58.078084 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:58.078030 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rwd92" podStartSLOduration=1.779521231 podStartE2EDuration="3.078014337s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.957305099 +0000 UTC m=+148.803073989" lastFinishedPulling="2026-04-16 22:15:57.255798203 +0000 UTC m=+150.101567095" observedRunningTime="2026-04-16 22:15:58.077871183 +0000 UTC m=+150.923640094" watchObservedRunningTime="2026-04-16 22:15:58.078014337 +0000 UTC m=+150.923783250"
Apr 16 22:15:59.067019 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.066987 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-87gkq" event={"ID":"7db6d6a8-5304-4b41-87c9-a4f433031f6e","Type":"ContainerStarted","Data":"11cd8f539996c3cdfee779660442f868df3178fefa022ccbfc4a613138f8571e"}
Apr 16 22:15:59.068451 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.068429 2579 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/0.log" Apr 16 22:15:59.068546 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.068471 2579 generic.go:358] "Generic (PLEG): container finished" podID="42c17f94-3e64-4703-a384-9593c55d048f" containerID="1b7bea15e9b6ae36e78bdae295b381028ce29632f56fe6f57d90a7331d053c7e" exitCode=255 Apr 16 22:15:59.068593 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.068553 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" event={"ID":"42c17f94-3e64-4703-a384-9593c55d048f","Type":"ContainerDied","Data":"1b7bea15e9b6ae36e78bdae295b381028ce29632f56fe6f57d90a7331d053c7e"} Apr 16 22:15:59.068740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.068707 2579 scope.go:117] "RemoveContainer" containerID="1b7bea15e9b6ae36e78bdae295b381028ce29632f56fe6f57d90a7331d053c7e" Apr 16 22:15:59.069952 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.069926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" event={"ID":"3d72e208-6678-427c-826b-098451ce245c","Type":"ContainerStarted","Data":"d3334edb5c3db134ffd9da7bf592ad497a4aad207e31c8fc6036e1a03d191365"} Apr 16 22:15:59.083689 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.083650 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-87gkq" podStartSLOduration=1.129712682 podStartE2EDuration="4.083638918s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.975205387 +0000 UTC m=+148.820974278" lastFinishedPulling="2026-04-16 22:15:58.929131621 +0000 UTC m=+151.774900514" observedRunningTime="2026-04-16 22:15:59.082702319 +0000 UTC m=+151.928471231" watchObservedRunningTime="2026-04-16 22:15:59.083638918 +0000 UTC m=+151.929407853" Apr 16 
22:15:59.122067 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.121902 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" podStartSLOduration=1.251074182 podStartE2EDuration="4.121887188s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:15:56.064940688 +0000 UTC m=+148.910709593" lastFinishedPulling="2026-04-16 22:15:58.935753707 +0000 UTC m=+151.781522599" observedRunningTime="2026-04-16 22:15:59.121865332 +0000 UTC m=+151.967634244" watchObservedRunningTime="2026-04-16 22:15:59.121887188 +0000 UTC m=+151.967656102" Apr 16 22:15:59.368989 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.368936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2" Apr 16 22:15:59.369151 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:15:59.369037 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs" Apr 16 22:15:59.369151 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:59.369097 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:59.369226 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:59.369165 2579 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:59.369226 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:59.369180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls podName:a242c550-213a-4a82-8bbb-01a37bbc13c5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.369157507 +0000 UTC m=+156.214926398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5c4w2" (UID: "a242c550-213a-4a82-8bbb-01a37bbc13c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:59.369226 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:15:59.369213 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls podName:e3b13759-2e6b-457f-a585-d1648b1b4543 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.369200518 +0000 UTC m=+156.214969408 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gscs" (UID: "e3b13759-2e6b-457f-a585-d1648b1b4543") : secret "samples-operator-tls" not found Apr 16 22:16:00.078216 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.078189 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:16:00.078617 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.078533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/0.log" Apr 16 22:16:00.078617 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.078564 2579 generic.go:358] "Generic (PLEG): container finished" podID="42c17f94-3e64-4703-a384-9593c55d048f" containerID="4898d325a390cb9bcc33339f5e45d8cbec4e04f7ec1ed61a17d313a0f7b723b2" exitCode=255 Apr 16 22:16:00.078721 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.078649 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" event={"ID":"42c17f94-3e64-4703-a384-9593c55d048f","Type":"ContainerDied","Data":"4898d325a390cb9bcc33339f5e45d8cbec4e04f7ec1ed61a17d313a0f7b723b2"} Apr 16 22:16:00.078721 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.078695 2579 scope.go:117] "RemoveContainer" containerID="1b7bea15e9b6ae36e78bdae295b381028ce29632f56fe6f57d90a7331d053c7e" Apr 16 22:16:00.078924 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.078903 2579 scope.go:117] "RemoveContainer" containerID="4898d325a390cb9bcc33339f5e45d8cbec4e04f7ec1ed61a17d313a0f7b723b2" Apr 16 22:16:00.079081 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:00.079063 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-s2fv5_openshift-console-operator(42c17f94-3e64-4703-a384-9593c55d048f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" podUID="42c17f94-3e64-4703-a384-9593c55d048f" Apr 16 22:16:00.555748 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.555684 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"] Apr 16 22:16:00.559536 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.559514 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:00.563213 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.563194 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 22:16:00.563310 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.563222 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 22:16:00.563368 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.563353 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rlz85\"" Apr 16 22:16:00.574002 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.573982 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"] Apr 16 22:16:00.581281 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.581261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:00.581369 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.581294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35b5e138-1f3a-4054-9071-6e12676b9b25-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:00.681612 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.681578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:00.681761 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.681625 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35b5e138-1f3a-4054-9071-6e12676b9b25-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:00.681839 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:00.681761 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:16:00.681895 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:00.681842 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert podName:35b5e138-1f3a-4054-9071-6e12676b9b25 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.181822132 +0000 UTC m=+154.027591034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6z6fs" (UID: "35b5e138-1f3a-4054-9071-6e12676b9b25") : secret "networking-console-plugin-cert" not found Apr 16 22:16:00.682744 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:00.682711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35b5e138-1f3a-4054-9071-6e12676b9b25-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:01.083191 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:01.083167 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:16:01.083749 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:01.083579 2579 scope.go:117] "RemoveContainer" containerID="4898d325a390cb9bcc33339f5e45d8cbec4e04f7ec1ed61a17d313a0f7b723b2" Apr 16 22:16:01.083849 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:01.083823 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-s2fv5_openshift-console-operator(42c17f94-3e64-4703-a384-9593c55d048f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" podUID="42c17f94-3e64-4703-a384-9593c55d048f" 
Apr 16 22:16:01.187053 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:01.187025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:01.187193 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:01.187154 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:16:01.187268 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:01.187223 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert podName:35b5e138-1f3a-4054-9071-6e12676b9b25 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.187202467 +0000 UTC m=+155.032971360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6z6fs" (UID: "35b5e138-1f3a-4054-9071-6e12676b9b25") : secret "networking-console-plugin-cert" not found Apr 16 22:16:02.196140 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.196107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" Apr 16 22:16:02.196581 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:02.196233 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:16:02.196581 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:02.196315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert podName:35b5e138-1f3a-4054-9071-6e12676b9b25 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:04.196294156 +0000 UTC m=+157.042063069 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6z6fs" (UID: "35b5e138-1f3a-4054-9071-6e12676b9b25") : secret "networking-console-plugin-cert" not found Apr 16 22:16:02.760331 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.760294 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g5cv2"] Apr 16 22:16:02.763672 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.763650 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.767462 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.767445 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-kg5qv\"" Apr 16 22:16:02.767920 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.767905 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 22:16:02.768565 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.768539 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 22:16:02.768649 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.768622 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 22:16:02.768751 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.768647 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 22:16:02.780751 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.780715 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g5cv2"] Apr 16 22:16:02.800533 
ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.800511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bmx\" (UniqueName: \"kubernetes.io/projected/2ee72b8e-0396-4204-8f3d-185939377d5c-kube-api-access-52bmx\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.800653 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.800549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2ee72b8e-0396-4204-8f3d-185939377d5c-signing-cabundle\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.800653 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.800626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2ee72b8e-0396-4204-8f3d-185939377d5c-signing-key\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.901513 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.901481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52bmx\" (UniqueName: \"kubernetes.io/projected/2ee72b8e-0396-4204-8f3d-185939377d5c-kube-api-access-52bmx\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.901626 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.901523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2ee72b8e-0396-4204-8f3d-185939377d5c-signing-cabundle\") 
pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.901781 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.901760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2ee72b8e-0396-4204-8f3d-185939377d5c-signing-key\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.902212 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.902189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2ee72b8e-0396-4204-8f3d-185939377d5c-signing-cabundle\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.903886 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.903868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2ee72b8e-0396-4204-8f3d-185939377d5c-signing-key\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:02.911533 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:02.911514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bmx\" (UniqueName: \"kubernetes.io/projected/2ee72b8e-0396-4204-8f3d-185939377d5c-kube-api-access-52bmx\") pod \"service-ca-865cb79987-g5cv2\" (UID: \"2ee72b8e-0396-4204-8f3d-185939377d5c\") " pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:03.020638 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.020577 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process 
volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-66f4955778-rrcqf" podUID="d1e036c3-8dac-4562-9015-62e7e9f36238" Apr 16 22:16:03.033909 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.033887 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rpbsg" podUID="7b7fa80a-7e5b-4b14-8792-ff01bd1f2143" Apr 16 22:16:03.057953 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.057933 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-sfffv" podUID="ed1be888-8420-4861-992a-ffd27fc02a14" Apr 16 22:16:03.072171 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.072155 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g5cv2" Apr 16 22:16:03.087484 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.087464 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:16:03.087572 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.087470 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rpbsg" Apr 16 22:16:03.191874 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.191846 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g5cv2"] Apr 16 22:16:03.196818 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:03.196781 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee72b8e_0396_4204_8f3d_185939377d5c.slice/crio-f5096bdf42ad9672c4f3aaf513f64d49e4a8987a54dec5b63414f1f932289466 WatchSource:0}: Error finding container f5096bdf42ad9672c4f3aaf513f64d49e4a8987a54dec5b63414f1f932289466: Status 404 returned error can't find the container with id f5096bdf42ad9672c4f3aaf513f64d49e4a8987a54dec5b63414f1f932289466 Apr 16 22:16:03.407455 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.407370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2" Apr 16 22:16:03.407455 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.407430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs" Apr 16 22:16:03.407642 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.407513 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 
22:16:03.407642 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.407529 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:16:03.407642 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.407580 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls podName:a242c550-213a-4a82-8bbb-01a37bbc13c5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:11.407562064 +0000 UTC m=+164.253330954 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5c4w2" (UID: "a242c550-213a-4a82-8bbb-01a37bbc13c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:03.407642 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:03.407597 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls podName:e3b13759-2e6b-457f-a585-d1648b1b4543 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:11.407590737 +0000 UTC m=+164.253359630 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gscs" (UID: "e3b13759-2e6b-457f-a585-d1648b1b4543") : secret "samples-operator-tls" not found
Apr 16 22:16:03.571489 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:03.571465 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-msw8q_e7417e8e-90d4-47a8-926c-b10f15f3a850/dns-node-resolver/0.log"
Apr 16 22:16:04.090952 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:04.090918 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g5cv2" event={"ID":"2ee72b8e-0396-4204-8f3d-185939377d5c","Type":"ContainerStarted","Data":"7f3221ddd9e2cfdc516d3045dc621cd049854eedac84a20e0911c3954c1ac439"}
Apr 16 22:16:04.090952 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:04.090956 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g5cv2" event={"ID":"2ee72b8e-0396-4204-8f3d-185939377d5c","Type":"ContainerStarted","Data":"f5096bdf42ad9672c4f3aaf513f64d49e4a8987a54dec5b63414f1f932289466"}
Apr 16 22:16:04.134871 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:04.134821 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-g5cv2" podStartSLOduration=2.134808351 podStartE2EDuration="2.134808351s" podCreationTimestamp="2026-04-16 22:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:04.13442211 +0000 UTC m=+156.980191038" watchObservedRunningTime="2026-04-16 22:16:04.134808351 +0000 UTC m=+156.980577264"
Apr 16 22:16:04.184738 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:04.184703 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flbzq_598903e0-6d7b-4392-b685-da66c0408923/node-ca/0.log"
Apr 16 22:16:04.212697 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:04.212672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"
Apr 16 22:16:04.213069 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:04.212789 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:04.213069 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:04.212844 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert podName:35b5e138-1f3a-4054-9071-6e12676b9b25 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:08.212827313 +0000 UTC m=+161.058596202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6z6fs" (UID: "35b5e138-1f3a-4054-9071-6e12676b9b25") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:04.688046 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:04.688000 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qgfjd" podUID="a2d7d39e-d19f-4a6e-8107-593903f29181"
Apr 16 22:16:05.961693 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:05.961654 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:16:05.961693 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:05.961687 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5"
Apr 16 22:16:05.962242 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:05.962160 2579 scope.go:117] "RemoveContainer" containerID="4898d325a390cb9bcc33339f5e45d8cbec4e04f7ec1ed61a17d313a0f7b723b2"
Apr 16 22:16:05.962384 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:05.962361 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-s2fv5_openshift-console-operator(42c17f94-3e64-4703-a384-9593c55d048f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" podUID="42c17f94-3e64-4703-a384-9593c55d048f"
Apr 16 22:16:07.943031 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:07.943002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:07.943050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:07.943075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") pod \"image-registry-66f4955778-rrcqf\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " pod="openshift-image-registry/image-registry-66f4955778-rrcqf"
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943168 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943173 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943239 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f4955778-rrcqf: secret "image-registry-tls" not found
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943242 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls podName:7b7fa80a-7e5b-4b14-8792-ff01bd1f2143 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:09.943219431 +0000 UTC m=+282.788988336 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls") pod "dns-default-rpbsg" (UID: "7b7fa80a-7e5b-4b14-8792-ff01bd1f2143") : secret "dns-default-metrics-tls" not found
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943174 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943280 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls podName:d1e036c3-8dac-4562-9015-62e7e9f36238 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:09.943267729 +0000 UTC m=+282.789036622 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls") pod "image-registry-66f4955778-rrcqf" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238") : secret "image-registry-tls" not found
Apr 16 22:16:07.943497 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:07.943351 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert podName:ed1be888-8420-4861-992a-ffd27fc02a14 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:09.943334832 +0000 UTC m=+282.789103727 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert") pod "ingress-canary-sfffv" (UID: "ed1be888-8420-4861-992a-ffd27fc02a14") : secret "canary-serving-cert" not found
Apr 16 22:16:08.245827 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:08.245800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"
Apr 16 22:16:08.245991 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:08.245955 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:08.246059 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:08.246036 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert podName:35b5e138-1f3a-4054-9071-6e12676b9b25 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:16.24601488 +0000 UTC m=+169.091783789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6z6fs" (UID: "35b5e138-1f3a-4054-9071-6e12676b9b25") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:11.472500 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:11.472420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:16:11.472500 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:11.472471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:16:11.472907 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:11.472570 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:16:11.472907 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:11.472649 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls podName:a242c550-213a-4a82-8bbb-01a37bbc13c5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:27.472630491 +0000 UTC m=+180.318399384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5c4w2" (UID: "a242c550-213a-4a82-8bbb-01a37bbc13c5") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:16:11.474685 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:11.474664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3b13759-2e6b-457f-a585-d1648b1b4543-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gscs\" (UID: \"e3b13759-2e6b-457f-a585-d1648b1b4543\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:16:11.556742 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:11.556705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"
Apr 16 22:16:11.683712 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:11.683685 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs"]
Apr 16 22:16:12.109720 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:12.109683 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs" event={"ID":"e3b13759-2e6b-457f-a585-d1648b1b4543","Type":"ContainerStarted","Data":"5ef9542b178c971b8d5d7eccc07cc0a84568b52c31e564e10e704f644429948d"}
Apr 16 22:16:14.115980 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:14.115936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs" event={"ID":"e3b13759-2e6b-457f-a585-d1648b1b4543","Type":"ContainerStarted","Data":"c66bda5bb1830faface4e9604ec8c3a8883bb35c3ed596e035d5327d56a485a1"}
Apr 16 22:16:14.116346 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:14.115986 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs" event={"ID":"e3b13759-2e6b-457f-a585-d1648b1b4543","Type":"ContainerStarted","Data":"466dc7b50758c7ee42b77c9d5404b4fe1cc9f4f81383a8ce7eeab85661c012cc"}
Apr 16 22:16:14.132480 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:14.132440 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gscs" podStartSLOduration=17.778226119 podStartE2EDuration="19.132429147s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:16:11.720283739 +0000 UTC m=+164.566052629" lastFinishedPulling="2026-04-16 22:16:13.074486764 +0000 UTC m=+165.920255657" observedRunningTime="2026-04-16 22:16:14.131209441 +0000 UTC m=+166.976978366" watchObservedRunningTime="2026-04-16 22:16:14.132429147 +0000 UTC m=+166.978198058"
Apr 16 22:16:16.318382 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:16.318343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"
Apr 16 22:16:16.320683 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:16.320653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35b5e138-1f3a-4054-9071-6e12676b9b25-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6z6fs\" (UID: \"35b5e138-1f3a-4054-9071-6e12676b9b25\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"
Apr 16 22:16:16.470425 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:16.470388 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"
Apr 16 22:16:16.598290 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:16.598225 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs"]
Apr 16 22:16:16.600874 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:16.600843 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b5e138_1f3a_4054_9071_6e12676b9b25.slice/crio-d6b5a41341d557b747dbb708bc575be65fcbc291f6fa3dab66a5c41afb66a163 WatchSource:0}: Error finding container d6b5a41341d557b747dbb708bc575be65fcbc291f6fa3dab66a5c41afb66a163: Status 404 returned error can't find the container with id d6b5a41341d557b747dbb708bc575be65fcbc291f6fa3dab66a5c41afb66a163
Apr 16 22:16:16.671827 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:16.671793 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:16:17.123467 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:17.123431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" event={"ID":"35b5e138-1f3a-4054-9071-6e12676b9b25","Type":"ContainerStarted","Data":"d6b5a41341d557b747dbb708bc575be65fcbc291f6fa3dab66a5c41afb66a163"}
Apr 16 22:16:18.126584 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:18.126553 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" event={"ID":"35b5e138-1f3a-4054-9071-6e12676b9b25","Type":"ContainerStarted","Data":"52f8c7202f0a7fbf84cef2f334f1374f32fd4d0cfb5f929bdc832cd5aa6e0c5a"}
Apr 16 22:16:18.142221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:18.142180 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6z6fs" podStartSLOduration=17.129534947 podStartE2EDuration="18.142167342s" podCreationTimestamp="2026-04-16 22:16:00 +0000 UTC" firstStartedPulling="2026-04-16 22:16:16.602737099 +0000 UTC m=+169.448506010" lastFinishedPulling="2026-04-16 22:16:17.615369512 +0000 UTC m=+170.461138405" observedRunningTime="2026-04-16 22:16:18.141051486 +0000 UTC m=+170.986820395" watchObservedRunningTime="2026-04-16 22:16:18.142167342 +0000 UTC m=+170.987936305"
Apr 16 22:16:18.672316 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:18.672285 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd"
Apr 16 22:16:18.672521 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:18.672506 2579 scope.go:117] "RemoveContainer" containerID="4898d325a390cb9bcc33339f5e45d8cbec4e04f7ec1ed61a17d313a0f7b723b2"
Apr 16 22:16:19.130811 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:19.130783 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log"
Apr 16 22:16:19.131165 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:19.130841 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" event={"ID":"42c17f94-3e64-4703-a384-9593c55d048f","Type":"ContainerStarted","Data":"3f5298904c0316e4e91c0b63a48888f3e418e433cbd4de96879834257440e995"}
Apr 16 22:16:19.147939 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:19.147899 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" podStartSLOduration=21.304146863 podStartE2EDuration="24.147888727s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:15:56.089438173 +0000 UTC m=+148.935207063" lastFinishedPulling="2026-04-16 22:15:58.933180037 +0000 UTC m=+151.778948927" observedRunningTime="2026-04-16 22:16:19.146714445 +0000 UTC m=+171.992483359" watchObservedRunningTime="2026-04-16 22:16:19.147888727 +0000 UTC m=+171.993657639"
Apr 16 22:16:25.336030 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.336000 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lbd7f"]
Apr 16 22:16:25.338920 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.338901 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8wkch"]
Apr 16 22:16:25.339092 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.339072 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lbd7f"
Apr 16 22:16:25.342041 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.342017 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 22:16:25.342166 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.342092 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.342363 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.342294 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 22:16:25.342450 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.342385 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-zvn68\""
Apr 16 22:16:25.345182 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.345164 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:16:25.345794 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.345779 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:16:25.345949 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.345929 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bpg4g\""
Apr 16 22:16:25.353684 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.353661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lbd7f"]
Apr 16 22:16:25.357924 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.357904 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8wkch"]
Apr 16 22:16:25.492241 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.492208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8b49\" (UniqueName: \"kubernetes.io/projected/39788394-ead2-4c74-8022-fedd8f2c6a08-kube-api-access-j8b49\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.492405 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.492257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39788394-ead2-4c74-8022-fedd8f2c6a08-crio-socket\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.492405 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.492320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39788394-ead2-4c74-8022-fedd8f2c6a08-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.492405 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.492363 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39788394-ead2-4c74-8022-fedd8f2c6a08-data-volume\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.492536 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.492486 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39788394-ead2-4c74-8022-fedd8f2c6a08-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.492582 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.492548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6n4q\" (UniqueName: \"kubernetes.io/projected/e2010ec2-eb19-4021-a065-02a91f2ca7ee-kube-api-access-r6n4q\") pod \"downloads-6bcc868b7-lbd7f\" (UID: \"e2010ec2-eb19-4021-a065-02a91f2ca7ee\") " pod="openshift-console/downloads-6bcc868b7-lbd7f"
Apr 16 22:16:25.593197 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8b49\" (UniqueName: \"kubernetes.io/projected/39788394-ead2-4c74-8022-fedd8f2c6a08-kube-api-access-j8b49\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593197 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39788394-ead2-4c74-8022-fedd8f2c6a08-crio-socket\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593197 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593174 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39788394-ead2-4c74-8022-fedd8f2c6a08-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593197 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39788394-ead2-4c74-8022-fedd8f2c6a08-data-volume\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593265 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39788394-ead2-4c74-8022-fedd8f2c6a08-crio-socket\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39788394-ead2-4c74-8022-fedd8f2c6a08-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6n4q\" (UniqueName: \"kubernetes.io/projected/e2010ec2-eb19-4021-a065-02a91f2ca7ee-kube-api-access-r6n4q\") pod \"downloads-6bcc868b7-lbd7f\" (UID: \"e2010ec2-eb19-4021-a065-02a91f2ca7ee\") " pod="openshift-console/downloads-6bcc868b7-lbd7f"
Apr 16 22:16:25.593506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39788394-ead2-4c74-8022-fedd8f2c6a08-data-volume\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.593875 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.593860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39788394-ead2-4c74-8022-fedd8f2c6a08-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.595531 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.595508 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39788394-ead2-4c74-8022-fedd8f2c6a08-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.609892 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.609862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8b49\" (UniqueName: \"kubernetes.io/projected/39788394-ead2-4c74-8022-fedd8f2c6a08-kube-api-access-j8b49\") pod \"insights-runtime-extractor-8wkch\" (UID: \"39788394-ead2-4c74-8022-fedd8f2c6a08\") " pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.611386 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.611363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6n4q\" (UniqueName: \"kubernetes.io/projected/e2010ec2-eb19-4021-a065-02a91f2ca7ee-kube-api-access-r6n4q\") pod \"downloads-6bcc868b7-lbd7f\" (UID: \"e2010ec2-eb19-4021-a065-02a91f2ca7ee\") " pod="openshift-console/downloads-6bcc868b7-lbd7f"
Apr 16 22:16:25.649547 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.649525 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lbd7f"
Apr 16 22:16:25.656237 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.656216 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8wkch"
Apr 16 22:16:25.785482 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.785454 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lbd7f"]
Apr 16 22:16:25.788016 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:25.787988 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2010ec2_eb19_4021_a065_02a91f2ca7ee.slice/crio-0deb583402173f19cc027b898b606f57bc93c0aaeff13515370092d31dd759c0 WatchSource:0}: Error finding container 0deb583402173f19cc027b898b606f57bc93c0aaeff13515370092d31dd759c0: Status 404 returned error can't find the container with id 0deb583402173f19cc027b898b606f57bc93c0aaeff13515370092d31dd759c0
Apr 16 22:16:25.830243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:25.830218 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8wkch"]
Apr 16 22:16:25.832988 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:25.832963 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39788394_ead2_4c74_8022_fedd8f2c6a08.slice/crio-013b8b6e1a66ad7ef26cd88cc1e46a0ee5c1a6fac58e889d62d19fffbbd8869c WatchSource:0}: Error finding container 013b8b6e1a66ad7ef26cd88cc1e46a0ee5c1a6fac58e889d62d19fffbbd8869c: Status 404 returned error can't find the container with id 013b8b6e1a66ad7ef26cd88cc1e46a0ee5c1a6fac58e889d62d19fffbbd8869c
Apr 16 22:16:26.150242 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:26.150160 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wkch" event={"ID":"39788394-ead2-4c74-8022-fedd8f2c6a08","Type":"ContainerStarted","Data":"320b2cbe64f4f47dbddb28215244d3e9b1971064e3346ab4427edc0d03360f16"}
Apr 16 22:16:26.150242 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:26.150200 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wkch" event={"ID":"39788394-ead2-4c74-8022-fedd8f2c6a08","Type":"ContainerStarted","Data":"013b8b6e1a66ad7ef26cd88cc1e46a0ee5c1a6fac58e889d62d19fffbbd8869c"}
Apr 16 22:16:26.151124 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:26.151099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lbd7f" event={"ID":"e2010ec2-eb19-4021-a065-02a91f2ca7ee","Type":"ContainerStarted","Data":"0deb583402173f19cc027b898b606f57bc93c0aaeff13515370092d31dd759c0"}
Apr 16 22:16:27.155920 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:27.155885 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wkch" event={"ID":"39788394-ead2-4c74-8022-fedd8f2c6a08","Type":"ContainerStarted","Data":"3621ff883eb0f6e25780be5a833f9cb2b2dc4fe59d0ac036dc32023ad4c0c26f"}
Apr 16 22:16:27.509994 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:27.509951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:16:27.513071 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:27.513022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a242c550-213a-4a82-8bbb-01a37bbc13c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5c4w2\" (UID: \"a242c550-213a-4a82-8bbb-01a37bbc13c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:16:27.749202 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:27.749172 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sxgmc\""
Apr 16 22:16:27.757278 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:27.757244 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"
Apr 16 22:16:28.032561 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.032533 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-655c56db74-ht6dw"]
Apr 16 22:16:28.036978 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.036958 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-655c56db74-ht6dw"
Apr 16 22:16:28.040448 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.040400 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 22:16:28.040659 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.040622 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 22:16:28.040860 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.040843 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 22:16:28.041024 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.040987 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 22:16:28.041442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.041154 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v9r8c\""
Apr 16 22:16:28.041442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.041404 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 22:16:28.049880 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.049842 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-655c56db74-ht6dw"]
Apr 16 22:16:28.073118 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.073086 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2"]
Apr 16 22:16:28.075594 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:28.075569 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda242c550_213a_4a82_8bbb_01a37bbc13c5.slice/crio-9e4e3b11923b8998758a14dde1d07563b6e08e752d018944e62afc34574a6e54 WatchSource:0}: Error finding container 9e4e3b11923b8998758a14dde1d07563b6e08e752d018944e62afc34574a6e54: Status 404 returned error can't find the container with id 9e4e3b11923b8998758a14dde1d07563b6e08e752d018944e62afc34574a6e54
Apr 16 22:16:28.114434 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.114405 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-service-ca\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw"
Apr 16 22:16:28.114602 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.114466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-serving-cert\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw"
Apr 16 22:16:28.114602 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.114523 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qqd\" (UniqueName: \"kubernetes.io/projected/49536ae0-0390-4157-8cd9-1c2a5e86badf-kube-api-access-z9qqd\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw"
Apr 16 22:16:28.114710 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.114613 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-config\")
pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.114791 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.114780 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-oauth-serving-cert\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.114852 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.114822 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-oauth-config\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.159457 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.159426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2" event={"ID":"a242c550-213a-4a82-8bbb-01a37bbc13c5","Type":"ContainerStarted","Data":"9e4e3b11923b8998758a14dde1d07563b6e08e752d018944e62afc34574a6e54"} Apr 16 22:16:28.161632 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.161607 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wkch" event={"ID":"39788394-ead2-4c74-8022-fedd8f2c6a08","Type":"ContainerStarted","Data":"9fb3bae4f24e6a5db460c24309cd00af5f1912d9ad15d4e9a4c9fc553ed579b1"} Apr 16 22:16:28.180867 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.180660 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8wkch" podStartSLOduration=1.075060547 podStartE2EDuration="3.180646387s" 
podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:25.880606349 +0000 UTC m=+178.726375239" lastFinishedPulling="2026-04-16 22:16:27.986192175 +0000 UTC m=+180.831961079" observedRunningTime="2026-04-16 22:16:28.180224977 +0000 UTC m=+181.025993879" watchObservedRunningTime="2026-04-16 22:16:28.180646387 +0000 UTC m=+181.026415302" Apr 16 22:16:28.215691 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.215631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-config\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.215691 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.215670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-oauth-serving-cert\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.215888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.215694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-oauth-config\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.215888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.215790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-service-ca\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " 
pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.215888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.215859 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-serving-cert\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.215888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.215883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qqd\" (UniqueName: \"kubernetes.io/projected/49536ae0-0390-4157-8cd9-1c2a5e86badf-kube-api-access-z9qqd\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.216478 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.216459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-config\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.216542 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.216502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-service-ca\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.216542 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.216506 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-oauth-serving-cert\") pod \"console-655c56db74-ht6dw\" (UID: 
\"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.219087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.219063 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-oauth-config\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.219273 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.219251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-serving-cert\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.225006 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.224979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qqd\" (UniqueName: \"kubernetes.io/projected/49536ae0-0390-4157-8cd9-1c2a5e86badf-kube-api-access-z9qqd\") pod \"console-655c56db74-ht6dw\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.352044 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.352013 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:28.486677 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:28.486578 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-655c56db74-ht6dw"] Apr 16 22:16:28.490526 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:28.490493 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49536ae0_0390_4157_8cd9_1c2a5e86badf.slice/crio-927c1604f8548d123864e67aeece63365fc7a0fd137408a061b845cccfd2e881 WatchSource:0}: Error finding container 927c1604f8548d123864e67aeece63365fc7a0fd137408a061b845cccfd2e881: Status 404 returned error can't find the container with id 927c1604f8548d123864e67aeece63365fc7a0fd137408a061b845cccfd2e881 Apr 16 22:16:29.131543 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:29.131276 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" Apr 16 22:16:29.136613 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:29.136587 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-s2fv5" Apr 16 22:16:29.165806 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:29.165772 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655c56db74-ht6dw" event={"ID":"49536ae0-0390-4157-8cd9-1c2a5e86badf","Type":"ContainerStarted","Data":"927c1604f8548d123864e67aeece63365fc7a0fd137408a061b845cccfd2e881"} Apr 16 22:16:30.175286 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.175201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2" event={"ID":"a242c550-213a-4a82-8bbb-01a37bbc13c5","Type":"ContainerStarted","Data":"8ea092e7527ac26c4db0e9a625d0e583518c91ae71867c5df3baa4b98ef2b23a"} Apr 16 22:16:30.196159 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:16:30.196042 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5c4w2" podStartSLOduration=33.482289998 podStartE2EDuration="35.196025524s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:16:28.078208676 +0000 UTC m=+180.923977569" lastFinishedPulling="2026-04-16 22:16:29.79194419 +0000 UTC m=+182.637713095" observedRunningTime="2026-04-16 22:16:30.194262688 +0000 UTC m=+183.040031601" watchObservedRunningTime="2026-04-16 22:16:30.196025524 +0000 UTC m=+183.041794437" Apr 16 22:16:30.324343 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.323389 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm"] Apr 16 22:16:30.327392 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.326950 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:30.329974 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.329946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 22:16:30.330438 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.330419 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cxbzf\"" Apr 16 22:16:30.338114 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.338076 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm"] Apr 16 22:16:30.439386 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.439310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/45c088ad-918e-444c-8625-7a0f758483d7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdjfm\" (UID: \"45c088ad-918e-444c-8625-7a0f758483d7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:30.540888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:30.540799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/45c088ad-918e-444c-8625-7a0f758483d7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdjfm\" (UID: \"45c088ad-918e-444c-8625-7a0f758483d7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:30.541079 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:30.540954 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 22:16:30.541079 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:30.541022 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c088ad-918e-444c-8625-7a0f758483d7-tls-certificates podName:45c088ad-918e-444c-8625-7a0f758483d7 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:31.041000019 +0000 UTC m=+183.886768912 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/45c088ad-918e-444c-8625-7a0f758483d7-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-hdjfm" (UID: "45c088ad-918e-444c-8625-7a0f758483d7") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 22:16:31.045161 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:31.045115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/45c088ad-918e-444c-8625-7a0f758483d7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdjfm\" (UID: \"45c088ad-918e-444c-8625-7a0f758483d7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:31.048799 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:31.048747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/45c088ad-918e-444c-8625-7a0f758483d7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdjfm\" (UID: \"45c088ad-918e-444c-8625-7a0f758483d7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:31.240841 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:31.240812 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:31.775154 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:31.775129 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm"] Apr 16 22:16:31.778229 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:31.778198 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c088ad_918e_444c_8625_7a0f758483d7.slice/crio-0928d2284657ba5265880f954a771e844c018786d4e49a82e6e1f88f32e9d1ca WatchSource:0}: Error finding container 0928d2284657ba5265880f954a771e844c018786d4e49a82e6e1f88f32e9d1ca: Status 404 returned error can't find the container with id 0928d2284657ba5265880f954a771e844c018786d4e49a82e6e1f88f32e9d1ca Apr 16 22:16:32.183387 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:32.183329 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" event={"ID":"45c088ad-918e-444c-8625-7a0f758483d7","Type":"ContainerStarted","Data":"0928d2284657ba5265880f954a771e844c018786d4e49a82e6e1f88f32e9d1ca"} Apr 16 22:16:32.184924 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:32.184880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655c56db74-ht6dw" event={"ID":"49536ae0-0390-4157-8cd9-1c2a5e86badf","Type":"ContainerStarted","Data":"73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6"} Apr 16 22:16:32.204591 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:32.204548 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-655c56db74-ht6dw" podStartSLOduration=0.999242905 podStartE2EDuration="4.204535236s" podCreationTimestamp="2026-04-16 22:16:28 +0000 UTC" firstStartedPulling="2026-04-16 22:16:28.4934691 +0000 UTC m=+181.339237993" 
lastFinishedPulling="2026-04-16 22:16:31.69876143 +0000 UTC m=+184.544530324" observedRunningTime="2026-04-16 22:16:32.203070715 +0000 UTC m=+185.048839629" watchObservedRunningTime="2026-04-16 22:16:32.204535236 +0000 UTC m=+185.050304152" Apr 16 22:16:33.189506 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:33.189413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" event={"ID":"45c088ad-918e-444c-8625-7a0f758483d7","Type":"ContainerStarted","Data":"fd825747baa6bfef59088d5beaac206e209796367eb953d1e96ada3ed40eec0e"} Apr 16 22:16:33.208542 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:33.208459 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" podStartSLOduration=2.168669427 podStartE2EDuration="3.208441597s" podCreationTimestamp="2026-04-16 22:16:30 +0000 UTC" firstStartedPulling="2026-04-16 22:16:31.780572178 +0000 UTC m=+184.626341068" lastFinishedPulling="2026-04-16 22:16:32.820344348 +0000 UTC m=+185.666113238" observedRunningTime="2026-04-16 22:16:33.20830099 +0000 UTC m=+186.054069903" watchObservedRunningTime="2026-04-16 22:16:33.208441597 +0000 UTC m=+186.054210510" Apr 16 22:16:34.192676 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:34.192641 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:34.199229 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:34.199182 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdjfm" Apr 16 22:16:38.352538 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:38.352503 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:38.353049 
ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:38.352663 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:38.358280 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:38.358257 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:39.210537 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:39.210507 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:16:40.014229 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.014161 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"] Apr 16 22:16:40.018299 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.018267 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" Apr 16 22:16:40.022595 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.022009 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-64xk5"] Apr 16 22:16:40.025112 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.024924 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 22:16:40.025813 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.025791 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:40.026031 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.026015 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:40.026105 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.026015 2579 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qccb7\"" Apr 16 22:16:40.027266 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.027132 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-64xk5" Apr 16 22:16:40.054791 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.054767 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:40.055036 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.055007 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gp7ks\"" Apr 16 22:16:40.055140 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.055112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:40.055211 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.055186 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:40.056592 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.056533 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"] Apr 16 22:16:40.058020 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.057997 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xwxff"] Apr 16 22:16:40.061431 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.061414 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff" Apr 16 22:16:40.064789 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.064768 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:40.065071 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.065056 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 22:16:40.065311 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.065282 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 22:16:40.065511 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.065490 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-fvxh9\"" Apr 16 22:16:40.082601 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.082564 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xwxff"] Apr 16 22:16:40.123462 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-sys\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5" Apr 16 22:16:40.123618 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123486 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad29900d-bb7e-4e80-87db-950ad0387eff-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" Apr 16 22:16:40.123618 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5" Apr 16 22:16:40.123618 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33bca243-d299-49da-a48f-9f0396daab4e-metrics-client-ca\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5" Apr 16 22:16:40.123618 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5" Apr 16 22:16:40.123868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-textfile\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5" Apr 16 22:16:40.123868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123665 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-tls\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.123868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b8km\" (UniqueName: \"kubernetes.io/projected/33bca243-d299-49da-a48f-9f0396daab4e-kube-api-access-5b8km\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.123868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7s8\" (UniqueName: \"kubernetes.io/projected/ad29900d-bb7e-4e80-87db-950ad0387eff-kube-api-access-2d7s8\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.123868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad29900d-bb7e-4e80-87db-950ad0387eff-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.124118 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-root\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.124118 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.123926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-wtmp\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.124118 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.124006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad29900d-bb7e-4e80-87db-950ad0387eff-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.224700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224662 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdwq9\" (UniqueName: \"kubernetes.io/projected/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-api-access-pdwq9\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.224889 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad29900d-bb7e-4e80-87db-950ad0387eff-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.224889 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47cb66de-e3fc-492e-a19a-947935e77c6d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.224889 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224795 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-root\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.224889 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-wtmp\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-root\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/47cb66de-e3fc-492e-a19a-947935e77c6d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-wtmp\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.224979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad29900d-bb7e-4e80-87db-950ad0387eff-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-sys\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad29900d-bb7e-4e80-87db-950ad0387eff-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225059 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33bca243-d299-49da-a48f-9f0396daab4e-sys\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33bca243-d299-49da-a48f-9f0396daab4e-metrics-client-ca\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-textfile\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225449 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-tls\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225839 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b8km\" (UniqueName: \"kubernetes.io/projected/33bca243-d299-49da-a48f-9f0396daab4e-kube-api-access-5b8km\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.225839 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7s8\" (UniqueName: \"kubernetes.io/projected/ad29900d-bb7e-4e80-87db-950ad0387eff-kube-api-access-2d7s8\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.225839 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225755 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad29900d-bb7e-4e80-87db-950ad0387eff-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.225987 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.225888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.226149 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:40.226130 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 22:16:40.226239 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:40.226223 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-tls podName:33bca243-d299-49da-a48f-9f0396daab4e nodeName:}" failed. No retries permitted until 2026-04-16 22:16:40.726202311 +0000 UTC m=+193.571971203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-tls") pod "node-exporter-64xk5" (UID: "33bca243-d299-49da-a48f-9f0396daab4e") : secret "node-exporter-tls" not found
Apr 16 22:16:40.226595 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.226546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-textfile\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.227052 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.227034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33bca243-d299-49da-a48f-9f0396daab4e-metrics-client-ca\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.228652 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.228633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.228756 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.228697 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad29900d-bb7e-4e80-87db-950ad0387eff-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.228822 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.228780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad29900d-bb7e-4e80-87db-950ad0387eff-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.238447 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.238424 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7s8\" (UniqueName: \"kubernetes.io/projected/ad29900d-bb7e-4e80-87db-950ad0387eff-kube-api-access-2d7s8\") pod \"openshift-state-metrics-9d44df66c-m6k6x\" (UID: \"ad29900d-bb7e-4e80-87db-950ad0387eff\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.239422 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.239401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b8km\" (UniqueName: \"kubernetes.io/projected/33bca243-d299-49da-a48f-9f0396daab4e-kube-api-access-5b8km\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.326913 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.326825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327061 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.326932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327061 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:40.326999 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 22:16:40.327172 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:40.327080 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-tls podName:47cb66de-e3fc-492e-a19a-947935e77c6d nodeName:}" failed. No retries permitted until 2026-04-16 22:16:40.827060666 +0000 UTC m=+193.672829573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xwxff" (UID: "47cb66de-e3fc-492e-a19a-947935e77c6d") : secret "kube-state-metrics-tls" not found
Apr 16 22:16:40.327231 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.327198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327374 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.327339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdwq9\" (UniqueName: \"kubernetes.io/projected/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-api-access-pdwq9\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327458 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.327386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47cb66de-e3fc-492e-a19a-947935e77c6d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327458 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.327441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/47cb66de-e3fc-492e-a19a-947935e77c6d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327857 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.327797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/47cb66de-e3fc-492e-a19a-947935e77c6d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.327951 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.327897 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.328190 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.328169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47cb66de-e3fc-492e-a19a-947935e77c6d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.330160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.330140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.332581 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.332561 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"
Apr 16 22:16:40.344046 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.344024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdwq9\" (UniqueName: \"kubernetes.io/projected/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-api-access-pdwq9\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.730720 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.730511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-tls\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.733556 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.733533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33bca243-d299-49da-a48f-9f0396daab4e-node-exporter-tls\") pod \"node-exporter-64xk5\" (UID: \"33bca243-d299-49da-a48f-9f0396daab4e\") " pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.831309 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.831249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.834097 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.834070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/47cb66de-e3fc-492e-a19a-947935e77c6d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xwxff\" (UID: \"47cb66de-e3fc-492e-a19a-947935e77c6d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:40.905417 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.905383 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:16:40.910828 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.910807 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:40.914138 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.914115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 22:16:40.914243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.914134 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 22:16:40.914243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.914187 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 22:16:40.914612 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.914593 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 22:16:40.915019 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.914994 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 22:16:40.915112 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.915102 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 22:16:40.915174 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.915134 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8vltk\""
Apr 16 22:16:40.915333 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.915304 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 22:16:40.915455 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.915416 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 22:16:40.915923 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.915906 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 22:16:40.931114 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.931095 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:16:40.939630 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.939606 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-64xk5"
Apr 16 22:16:40.972466 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:40.972442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff"
Apr 16 22:16:41.033201 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033201 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-config-out\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033444 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033493 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-web-config\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033535 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnsrb\" (UniqueName: \"kubernetes.io/projected/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-kube-api-access-tnsrb\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.033682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.034153 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033715 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.034153 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.033771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.134579 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.134579 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.134858 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-web-config\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.134858 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.134858 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134854 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnsrb\" (UniqueName: \"kubernetes.io/projected/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-kube-api-access-tnsrb\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.134996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-config-out\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.135417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID:
\"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.135846 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:41.135446 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-trusted-ca-bundle podName:1a98cab9-f502-4096-b3bd-6cb5d95b5cbd nodeName:}" failed. No retries permitted until 2026-04-16 22:16:41.635428232 +0000 UTC m=+194.481197124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "1a98cab9-f502-4096-b3bd-6cb5d95b5cbd") : configmap references non-existent config key: ca-bundle.crt Apr 16 22:16:41.138771 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.138477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.138771 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.138477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.139170 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.139145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.140244 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.140215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.140989 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.140866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.140989 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.140970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-web-config\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.140989 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.140978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.141189 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.141051 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.141189 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.141154 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-config-out\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.147516 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.147497 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnsrb\" (UniqueName: \"kubernetes.io/projected/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-kube-api-access-tnsrb\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.639934 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.639898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.640798 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.640765 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a98cab9-f502-4096-b3bd-6cb5d95b5cbd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.822516 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.822474 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:41.946936 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.946850 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6"] Apr 16 22:16:41.952138 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.952119 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:41.955597 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.955573 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 22:16:41.955707 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.955620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 22:16:41.956079 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.956057 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-62uqrbj2c8gnc\"" Apr 16 22:16:41.956079 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.956070 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 22:16:41.956233 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.956079 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 22:16:41.956390 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.956373 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 22:16:41.956550 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.956535 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-d66bm\"" Apr 16 22:16:41.973554 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:41.973532 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6"] Apr 16 22:16:42.044573 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-metrics-client-ca\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044595 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044742 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-tls\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044914 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bfvd\" (UniqueName: \"kubernetes.io/projected/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-kube-api-access-8bfvd\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.045007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.044942 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-grpc-tls\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" 
(UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.145916 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.145883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-grpc-tls\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.145942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-metrics-client-ca\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.145997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.146030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146274 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.146246 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.146341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-tls\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.146407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.146460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bfvd\" (UniqueName: \"kubernetes.io/projected/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-kube-api-access-8bfvd\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.146740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.146655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-metrics-client-ca\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.149224 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.149181 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.149329 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.149249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-grpc-tls\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.149329 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.149249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.149891 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.149852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-tls\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " 
pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.149997 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.149954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.150087 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.150062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.155464 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.155441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bfvd\" (UniqueName: \"kubernetes.io/projected/c47807c1-3916-46cb-8aeb-41f3b3e23fd9-kube-api-access-8bfvd\") pod \"thanos-querier-9d8fbcf6f-pxjb6\" (UID: \"c47807c1-3916-46cb-8aeb-41f3b3e23fd9\") " pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.264117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.264079 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:42.537154 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:42.537119 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33bca243_d299_49da_a48f_9f0396daab4e.slice/crio-ac152bb54b9219f51fd59b8cdca9c6108c8803c5a4af6d81649ccb347ad00b83 WatchSource:0}: Error finding container ac152bb54b9219f51fd59b8cdca9c6108c8803c5a4af6d81649ccb347ad00b83: Status 404 returned error can't find the container with id ac152bb54b9219f51fd59b8cdca9c6108c8803c5a4af6d81649ccb347ad00b83 Apr 16 22:16:42.710688 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:42.710641 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad29900d_bb7e_4e80_87db_950ad0387eff.slice/crio-2af99c9483e27baa05a7ff4baf5d0bfdb9130db82a3cf1c0dbd31a25150e1cda WatchSource:0}: Error finding container 2af99c9483e27baa05a7ff4baf5d0bfdb9130db82a3cf1c0dbd31a25150e1cda: Status 404 returned error can't find the container with id 2af99c9483e27baa05a7ff4baf5d0bfdb9130db82a3cf1c0dbd31a25150e1cda Apr 16 22:16:42.713496 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.713449 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x"] Apr 16 22:16:42.723440 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.723417 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xwxff"] Apr 16 22:16:42.726032 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:42.726002 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47cb66de_e3fc_492e_a19a_947935e77c6d.slice/crio-f93bc989ef5fd13f5cfa958c04656630d4f9473d30078eceb5181b0fa70cc133 WatchSource:0}: Error finding container 
f93bc989ef5fd13f5cfa958c04656630d4f9473d30078eceb5181b0fa70cc133: Status 404 returned error can't find the container with id f93bc989ef5fd13f5cfa958c04656630d4f9473d30078eceb5181b0fa70cc133 Apr 16 22:16:42.959295 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.959065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6"] Apr 16 22:16:42.964545 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:42.964260 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc47807c1_3916_46cb_8aeb_41f3b3e23fd9.slice/crio-f6aaa44eb2851ea8dd2a3f5fad790881bb009dc9226a383b930dbb1be85055e2 WatchSource:0}: Error finding container f6aaa44eb2851ea8dd2a3f5fad790881bb009dc9226a383b930dbb1be85055e2: Status 404 returned error can't find the container with id f6aaa44eb2851ea8dd2a3f5fad790881bb009dc9226a383b930dbb1be85055e2 Apr 16 22:16:42.971437 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:42.970755 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:42.975547 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:42.975519 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a98cab9_f502_4096_b3bd_6cb5d95b5cbd.slice/crio-641736e6e529f195d2993f10e5ca8c195c5fcf5b33cb1a660725c9af765074b4 WatchSource:0}: Error finding container 641736e6e529f195d2993f10e5ca8c195c5fcf5b33cb1a660725c9af765074b4: Status 404 returned error can't find the container with id 641736e6e529f195d2993f10e5ca8c195c5fcf5b33cb1a660725c9af765074b4 Apr 16 22:16:43.220738 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.220637 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"641736e6e529f195d2993f10e5ca8c195c5fcf5b33cb1a660725c9af765074b4"} Apr 16 22:16:43.222442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.222371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lbd7f" event={"ID":"e2010ec2-eb19-4021-a065-02a91f2ca7ee","Type":"ContainerStarted","Data":"05c7850d27fd70759ca381c6987cc5147dabf90197100bf91701055ef776cbba"} Apr 16 22:16:43.222987 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.222860 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lbd7f" Apr 16 22:16:43.224688 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.224331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"f6aaa44eb2851ea8dd2a3f5fad790881bb009dc9226a383b930dbb1be85055e2"} Apr 16 22:16:43.226442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.226303 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64xk5" event={"ID":"33bca243-d299-49da-a48f-9f0396daab4e","Type":"ContainerStarted","Data":"ac152bb54b9219f51fd59b8cdca9c6108c8803c5a4af6d81649ccb347ad00b83"} Apr 16 22:16:43.228595 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.227888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff" event={"ID":"47cb66de-e3fc-492e-a19a-947935e77c6d","Type":"ContainerStarted","Data":"f93bc989ef5fd13f5cfa958c04656630d4f9473d30078eceb5181b0fa70cc133"} Apr 16 22:16:43.230270 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.230181 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" 
event={"ID":"ad29900d-bb7e-4e80-87db-950ad0387eff","Type":"ContainerStarted","Data":"7fd4e7ab2f2c98ac9842233b52c9dd2419a170f36cede5a2506f1de655b23a82"} Apr 16 22:16:43.230270 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.230208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" event={"ID":"ad29900d-bb7e-4e80-87db-950ad0387eff","Type":"ContainerStarted","Data":"5a6bc016f3f4deab94b2b6643a18dcd3ee485c0569b23255e1745b0956f362e0"} Apr 16 22:16:43.230270 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.230222 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" event={"ID":"ad29900d-bb7e-4e80-87db-950ad0387eff","Type":"ContainerStarted","Data":"2af99c9483e27baa05a7ff4baf5d0bfdb9130db82a3cf1c0dbd31a25150e1cda"} Apr 16 22:16:43.234088 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.234067 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lbd7f" Apr 16 22:16:43.262020 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:43.261605 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lbd7f" podStartSLOduration=1.405420242 podStartE2EDuration="18.261586516s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:25.789799579 +0000 UTC m=+178.635568473" lastFinishedPulling="2026-04-16 22:16:42.645965851 +0000 UTC m=+195.491734747" observedRunningTime="2026-04-16 22:16:43.242410262 +0000 UTC m=+196.088179177" watchObservedRunningTime="2026-04-16 22:16:43.261586516 +0000 UTC m=+196.107355431" Apr 16 22:16:44.243930 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.243775 2579 generic.go:358] "Generic (PLEG): container finished" podID="33bca243-d299-49da-a48f-9f0396daab4e" containerID="97d34bf73d33f725b0b7c0664005f04d82b5074dae8ff04758890b2fa7faa564" exitCode=0 Apr 16 
22:16:44.244700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.244435 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64xk5" event={"ID":"33bca243-d299-49da-a48f-9f0396daab4e","Type":"ContainerDied","Data":"97d34bf73d33f725b0b7c0664005f04d82b5074dae8ff04758890b2fa7faa564"} Apr 16 22:16:44.667424 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.667351 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8"] Apr 16 22:16:44.700884 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.700670 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8"] Apr 16 22:16:44.702328 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.701440 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:44.706099 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.705897 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sn5qq\"" Apr 16 22:16:44.706227 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.706134 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 22:16:44.770539 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.770501 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cb55f8c68-f7znq"] Apr 16 22:16:44.781498 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.781362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54352985-5c56-411a-9cc0-f1fb17b7e0ab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hkct8\" (UID: \"54352985-5c56-411a-9cc0-f1fb17b7e0ab\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:44.787322 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.787293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.797475 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.797319 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:16:44.805063 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.805040 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb55f8c68-f7znq"] Apr 16 22:16:44.882349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-oauth-config\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.882349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882290 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-oauth-serving-cert\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54352985-5c56-411a-9cc0-f1fb17b7e0ab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hkct8\" (UID: \"54352985-5c56-411a-9cc0-f1fb17b7e0ab\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:44.882878 
ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-config\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:44.882579 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-trusted-ca-bundle\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:44.882660 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54352985-5c56-411a-9cc0-f1fb17b7e0ab-monitoring-plugin-cert podName:54352985-5c56-411a-9cc0-f1fb17b7e0ab nodeName:}" failed. No retries permitted until 2026-04-16 22:16:45.382639748 +0000 UTC m=+198.228408642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/54352985-5c56-411a-9cc0-f1fb17b7e0ab-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-hkct8" (UID: "54352985-5c56-411a-9cc0-f1fb17b7e0ab") : secret "monitoring-plugin-cert" not found Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-service-ca\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-serving-cert\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.882878 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.882879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhqq\" (UniqueName: \"kubernetes.io/projected/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-kube-api-access-hlhqq\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.983530 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-config\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 
16 22:16:44.983700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-trusted-ca-bundle\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.983700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-service-ca\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.983700 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-serving-cert\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.983868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhqq\" (UniqueName: \"kubernetes.io/projected/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-kube-api-access-hlhqq\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.983868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-oauth-config\") pod \"console-6cb55f8c68-f7znq\" (UID: 
\"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.983868 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.983797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-oauth-serving-cert\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.984371 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.984340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-service-ca\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.984482 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.984404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-oauth-serving-cert\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.984482 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.984432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-trusted-ca-bundle\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.986982 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.986959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-oauth-config\") pod 
\"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.987078 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.986955 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-serving-cert\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:44.993442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:44.993419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-config\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:45.000201 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:45.000177 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhqq\" (UniqueName: \"kubernetes.io/projected/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-kube-api-access-hlhqq\") pod \"console-6cb55f8c68-f7znq\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") " pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:45.102210 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:45.102173 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:45.388894 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:45.388854 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54352985-5c56-411a-9cc0-f1fb17b7e0ab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hkct8\" (UID: \"54352985-5c56-411a-9cc0-f1fb17b7e0ab\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:45.391966 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:45.391928 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/54352985-5c56-411a-9cc0-f1fb17b7e0ab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hkct8\" (UID: \"54352985-5c56-411a-9cc0-f1fb17b7e0ab\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:45.396694 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:45.396660 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb55f8c68-f7znq"] Apr 16 22:16:45.616632 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:45.616549 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:45.982965 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:45.982920 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a27d86a_93d5_49db_91e6_4ccb8863b1d9.slice/crio-c90d5283f7fad872c77233477843beaff50a9e2443e7c52c59b7869f5a411de4 WatchSource:0}: Error finding container c90d5283f7fad872c77233477843beaff50a9e2443e7c52c59b7869f5a411de4: Status 404 returned error can't find the container with id c90d5283f7fad872c77233477843beaff50a9e2443e7c52c59b7869f5a411de4 Apr 16 22:16:46.161127 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.160906 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8"] Apr 16 22:16:46.168220 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:16:46.168191 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54352985_5c56_411a_9cc0_f1fb17b7e0ab.slice/crio-5afc69abbeefdc35c76e5646f1d7e1984c2c7a1b559900923a081c65b0702eac WatchSource:0}: Error finding container 5afc69abbeefdc35c76e5646f1d7e1984c2c7a1b559900923a081c65b0702eac: Status 404 returned error can't find the container with id 5afc69abbeefdc35c76e5646f1d7e1984c2c7a1b559900923a081c65b0702eac Apr 16 22:16:46.254921 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.254834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64xk5" event={"ID":"33bca243-d299-49da-a48f-9f0396daab4e","Type":"ContainerStarted","Data":"7cdb9b5abfd80d50eea124e9338289122e7f602ef0ca27a77a0fec1c51e061f3"} Apr 16 22:16:46.256795 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.256760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff" 
event={"ID":"47cb66de-e3fc-492e-a19a-947935e77c6d","Type":"ContainerStarted","Data":"71e63c9539960875c9f54f9f0bc437ba38f0f3dda37cb1f82c240965805ac0ac"} Apr 16 22:16:46.258957 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.258931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" event={"ID":"ad29900d-bb7e-4e80-87db-950ad0387eff","Type":"ContainerStarted","Data":"ff6dce543f820ad348247c5d45226b67e740bdaee1b4ac5fafdf1ed6914fc7aa"} Apr 16 22:16:46.260117 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.260093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" event={"ID":"54352985-5c56-411a-9cc0-f1fb17b7e0ab","Type":"ContainerStarted","Data":"5afc69abbeefdc35c76e5646f1d7e1984c2c7a1b559900923a081c65b0702eac"} Apr 16 22:16:46.261892 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.261862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"f26553df6aac5792819251cf126e638b4ec0267abdc0fc23f63762b3d9649777"} Apr 16 22:16:46.263232 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.263209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb55f8c68-f7znq" event={"ID":"4a27d86a-93d5-49db-91e6-4ccb8863b1d9","Type":"ContainerStarted","Data":"a51f1d20c5d561fb974ac4f4aebfb746ee20ea7b2412585720c9d19aa05904c5"} Apr 16 22:16:46.263311 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.263238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb55f8c68-f7znq" event={"ID":"4a27d86a-93d5-49db-91e6-4ccb8863b1d9","Type":"ContainerStarted","Data":"c90d5283f7fad872c77233477843beaff50a9e2443e7c52c59b7869f5a411de4"} Apr 16 22:16:46.360992 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.360933 2579 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m6k6x" podStartSLOduration=4.943353773 podStartE2EDuration="7.360915895s" podCreationTimestamp="2026-04-16 22:16:39 +0000 UTC" firstStartedPulling="2026-04-16 22:16:42.828469295 +0000 UTC m=+195.674238186" lastFinishedPulling="2026-04-16 22:16:45.246031405 +0000 UTC m=+198.091800308" observedRunningTime="2026-04-16 22:16:46.360074507 +0000 UTC m=+199.205843419" watchObservedRunningTime="2026-04-16 22:16:46.360915895 +0000 UTC m=+199.206684812" Apr 16 22:16:46.477390 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:46.477314 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cb55f8c68-f7znq" podStartSLOduration=2.477295713 podStartE2EDuration="2.477295713s" podCreationTimestamp="2026-04-16 22:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:46.47507559 +0000 UTC m=+199.320844502" watchObservedRunningTime="2026-04-16 22:16:46.477295713 +0000 UTC m=+199.323064625" Apr 16 22:16:47.269721 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.269616 2579 generic.go:358] "Generic (PLEG): container finished" podID="1a98cab9-f502-4096-b3bd-6cb5d95b5cbd" containerID="f26553df6aac5792819251cf126e638b4ec0267abdc0fc23f63762b3d9649777" exitCode=0 Apr 16 22:16:47.269721 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.269697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerDied","Data":"f26553df6aac5792819251cf126e638b4ec0267abdc0fc23f63762b3d9649777"} Apr 16 22:16:47.273573 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.273541 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" 
event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"80a2677a1c95c5d59689e3dcfb8ca0e13c595a834e0b3edd44ca8d80571db337"} Apr 16 22:16:47.273693 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.273582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"bbf9e70a3c83bdf5137e2368a51f1d88ab1514baac0bb6d51a4f9e77dba842f6"} Apr 16 22:16:47.273693 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.273622 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"1282185bf04e28e3ac7f4df8070394dea1a79023d1a551c6263faf516cfa7071"} Apr 16 22:16:47.276434 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.276370 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-64xk5" event={"ID":"33bca243-d299-49da-a48f-9f0396daab4e","Type":"ContainerStarted","Data":"35918da314b7f5872dc7185833851ce166c1ab42c4e758945edbc32d5d04631f"} Apr 16 22:16:47.278848 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.278794 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff" event={"ID":"47cb66de-e3fc-492e-a19a-947935e77c6d","Type":"ContainerStarted","Data":"76481374d05208eac2009f8b237fc5832cd9460d4beeeeb79c6782e296e608de"} Apr 16 22:16:47.278848 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.278833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff" event={"ID":"47cb66de-e3fc-492e-a19a-947935e77c6d","Type":"ContainerStarted","Data":"205f5304c842c7c1edc42da66cf06eaf5450c420cc351338393768f5366584c2"} Apr 16 22:16:47.374769 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.374692 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xwxff" podStartSLOduration=4.856807852 podStartE2EDuration="7.374675662s" podCreationTimestamp="2026-04-16 22:16:40 +0000 UTC" firstStartedPulling="2026-04-16 22:16:42.728219136 +0000 UTC m=+195.573988029" lastFinishedPulling="2026-04-16 22:16:45.246086935 +0000 UTC m=+198.091855839" observedRunningTime="2026-04-16 22:16:47.373029801 +0000 UTC m=+200.218798714" watchObservedRunningTime="2026-04-16 22:16:47.374675662 +0000 UTC m=+200.220444574" Apr 16 22:16:47.698812 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.698648 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-64xk5" podStartSLOduration=7.799682186 podStartE2EDuration="8.698634838s" podCreationTimestamp="2026-04-16 22:16:39 +0000 UTC" firstStartedPulling="2026-04-16 22:16:42.566359196 +0000 UTC m=+195.412128093" lastFinishedPulling="2026-04-16 22:16:43.465311856 +0000 UTC m=+196.311080745" observedRunningTime="2026-04-16 22:16:47.404589776 +0000 UTC m=+200.250358688" watchObservedRunningTime="2026-04-16 22:16:47.698634838 +0000 UTC m=+200.544403788" Apr 16 22:16:47.969020 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:47.968774 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66f4955778-rrcqf"] Apr 16 22:16:47.969192 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:16:47.969135 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-66f4955778-rrcqf" podUID="d1e036c3-8dac-4562-9015-62e7e9f36238" Apr 16 22:16:48.284931 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.284889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" 
event={"ID":"54352985-5c56-411a-9cc0-f1fb17b7e0ab","Type":"ContainerStarted","Data":"dc50aa7f43c45888a5d22244dc9cc16ce7712d2af852ed22c8873640c48ca630"} Apr 16 22:16:48.285461 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.285426 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:16:48.285894 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.285872 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:48.292528 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.292508 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" Apr 16 22:16:48.293234 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.293124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:16:48.315089 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.315049 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hkct8" podStartSLOduration=2.712723767 podStartE2EDuration="4.315036593s" podCreationTimestamp="2026-04-16 22:16:44 +0000 UTC" firstStartedPulling="2026-04-16 22:16:46.1706992 +0000 UTC m=+199.016468093" lastFinishedPulling="2026-04-16 22:16:47.773012015 +0000 UTC m=+200.618780919" observedRunningTime="2026-04-16 22:16:48.314641022 +0000 UTC m=+201.160409934" watchObservedRunningTime="2026-04-16 22:16:48.315036593 +0000 UTC m=+201.160805505" Apr 16 22:16:48.428879 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.428848 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1e036c3-8dac-4562-9015-62e7e9f36238-ca-trust-extracted\") pod 
\"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.428879 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.428891 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpt7g\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-kube-api-access-zpt7g\") pod \"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.429160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.428948 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-image-registry-private-configuration\") pod \"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.429160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429019 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-installation-pull-secrets\") pod \"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.429160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429051 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-certificates\") pod \"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.429160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429083 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-bound-sa-token\") pod \"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: 
\"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.429160 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429114 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-trusted-ca\") pod \"d1e036c3-8dac-4562-9015-62e7e9f36238\" (UID: \"d1e036c3-8dac-4562-9015-62e7e9f36238\") " Apr 16 22:16:48.429552 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429527 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e036c3-8dac-4562-9015-62e7e9f36238-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:16:48.429842 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429786 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:48.429955 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.429841 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:48.432131 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.432087 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-kube-api-access-zpt7g" (OuterVolumeSpecName: "kube-api-access-zpt7g") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "kube-api-access-zpt7g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:48.432379 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.432351 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:48.433370 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.433343 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:48.434091 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.433997 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d1e036c3-8dac-4562-9015-62e7e9f36238" (UID: "d1e036c3-8dac-4562-9015-62e7e9f36238"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:48.530037 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530001 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-image-registry-private-configuration\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.530037 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530040 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1e036c3-8dac-4562-9015-62e7e9f36238-installation-pull-secrets\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.530037 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530056 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-certificates\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.530377 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530072 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-bound-sa-token\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.530377 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530086 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e036c3-8dac-4562-9015-62e7e9f36238-trusted-ca\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.530377 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530099 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1e036c3-8dac-4562-9015-62e7e9f36238-ca-trust-extracted\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 
22:16:48.530377 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:48.530113 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpt7g\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-kube-api-access-zpt7g\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:49.288371 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:49.288342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f4955778-rrcqf" Apr 16 22:16:49.357500 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:49.357462 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66f4955778-rrcqf"] Apr 16 22:16:49.362446 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:49.362163 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66f4955778-rrcqf"] Apr 16 22:16:49.440096 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:49.440060 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1e036c3-8dac-4562-9015-62e7e9f36238-registry-tls\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:16:49.678367 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:49.678288 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e036c3-8dac-4562-9015-62e7e9f36238" path="/var/lib/kubelet/pods/d1e036c3-8dac-4562-9015-62e7e9f36238/volumes" Apr 16 22:16:50.294796 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.294689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"2c5abe1ace05787ffe9e6e5e9579cc73febbb24fe54e3a82360c55527c9f9726"} Apr 16 22:16:50.294796 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.294759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"742d8d93366f6a8276c7fec42b0e7a1347b71b34c4b3a159811ea39f25306358"} Apr 16 22:16:50.294796 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.294775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"c09197587f4b9a4d4ca3bd6fe25ec2f3fef423caa48f626307fdeec99eaada6b"} Apr 16 22:16:50.294796 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.294788 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"e0df99157f97fea337405aa323aa2dfaf3a1d18d2d915d2b27a0e01b570df87e"} Apr 16 22:16:50.295423 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.294801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"e130c2b6f7ec4a3520a274a46fb86ddfb8845968d1b629ee78e17a3486192666"} Apr 16 22:16:50.295423 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.294814 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a98cab9-f502-4096-b3bd-6cb5d95b5cbd","Type":"ContainerStarted","Data":"847ba172677838ac1cdcd470a694e166da51c5693db54e00bc3c30e65a69e986"} Apr 16 22:16:50.297765 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.297715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"9082130f8bc6d0f12c4b221c12f84b3dd4a91a06716c70fa486ee3c6139f29ab"} Apr 16 22:16:50.297932 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.297771 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"61852c3077275d4ecbc1e9b6ded4427e38d9d312e7c181783d80673726144cf8"} Apr 16 22:16:50.297932 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.297786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" event={"ID":"c47807c1-3916-46cb-8aeb-41f3b3e23fd9","Type":"ContainerStarted","Data":"a3df090fcaf18c9ae867379f2a254967e3d8c589ee57f9790d0d6c40327c2896"} Apr 16 22:16:50.329614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.329565 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.048183497 podStartE2EDuration="10.32954804s" podCreationTimestamp="2026-04-16 22:16:40 +0000 UTC" firstStartedPulling="2026-04-16 22:16:42.978941263 +0000 UTC m=+195.824710168" lastFinishedPulling="2026-04-16 22:16:49.260305818 +0000 UTC m=+202.106074711" observedRunningTime="2026-04-16 22:16:50.32857305 +0000 UTC m=+203.174342000" watchObservedRunningTime="2026-04-16 22:16:50.32954804 +0000 UTC m=+203.175316954" Apr 16 22:16:50.361203 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:50.361155 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" podStartSLOduration=3.072755092 podStartE2EDuration="9.361140504s" podCreationTimestamp="2026-04-16 22:16:41 +0000 UTC" firstStartedPulling="2026-04-16 22:16:42.966601693 +0000 UTC m=+195.812370600" lastFinishedPulling="2026-04-16 22:16:49.254987115 +0000 UTC m=+202.100756012" observedRunningTime="2026-04-16 22:16:50.358750708 +0000 UTC m=+203.204519610" watchObservedRunningTime="2026-04-16 22:16:50.361140504 +0000 UTC m=+203.206909450" Apr 16 22:16:51.301987 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:51.301953 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:51.309317 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:51.309291 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-9d8fbcf6f-pxjb6" Apr 16 22:16:55.102601 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:55.102566 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:55.102601 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:55.102608 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:55.107427 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:55.107406 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:55.317995 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:55.317956 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cb55f8c68-f7znq" Apr 16 22:16:55.367516 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:16:55.367445 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-655c56db74-ht6dw"] Apr 16 22:17:15.375398 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:15.375321 2579 generic.go:358] "Generic (PLEG): container finished" podID="7db6d6a8-5304-4b41-87c9-a4f433031f6e" containerID="11cd8f539996c3cdfee779660442f868df3178fefa022ccbfc4a613138f8571e" exitCode=0 Apr 16 22:17:15.375856 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:15.375393 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-87gkq" event={"ID":"7db6d6a8-5304-4b41-87c9-a4f433031f6e","Type":"ContainerDied","Data":"11cd8f539996c3cdfee779660442f868df3178fefa022ccbfc4a613138f8571e"} Apr 16 22:17:15.375856 ip-10-0-136-39 kubenswrapper[2579]: I0416 
22:17:15.375656 2579 scope.go:117] "RemoveContainer" containerID="11cd8f539996c3cdfee779660442f868df3178fefa022ccbfc4a613138f8571e" Apr 16 22:17:16.380153 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:16.380120 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-87gkq" event={"ID":"7db6d6a8-5304-4b41-87c9-a4f433031f6e","Type":"ContainerStarted","Data":"e217f1936ce113d1b85028aec0c2f9027c320da8378e4497cd0db1cb5c6390b9"} Apr 16 22:17:20.389014 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.388974 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-655c56db74-ht6dw" podUID="49536ae0-0390-4157-8cd9-1c2a5e86badf" containerName="console" containerID="cri-o://73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6" gracePeriod=15 Apr 16 22:17:20.392007 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.391980 2579 generic.go:358] "Generic (PLEG): container finished" podID="3d72e208-6678-427c-826b-098451ce245c" containerID="d3334edb5c3db134ffd9da7bf592ad497a4aad207e31c8fc6036e1a03d191365" exitCode=0 Apr 16 22:17:20.392124 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.392055 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" event={"ID":"3d72e208-6678-427c-826b-098451ce245c","Type":"ContainerDied","Data":"d3334edb5c3db134ffd9da7bf592ad497a4aad207e31c8fc6036e1a03d191365"} Apr 16 22:17:20.392404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.392389 2579 scope.go:117] "RemoveContainer" containerID="d3334edb5c3db134ffd9da7bf592ad497a4aad207e31c8fc6036e1a03d191365" Apr 16 22:17:20.642821 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.642772 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-655c56db74-ht6dw_49536ae0-0390-4157-8cd9-1c2a5e86badf/console/0.log" Apr 16 22:17:20.642923 ip-10-0-136-39 kubenswrapper[2579]: I0416 
22:17:20.642838 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:17:20.703947 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.703923 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-config\") pod \"49536ae0-0390-4157-8cd9-1c2a5e86badf\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " Apr 16 22:17:20.704058 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.703961 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-serving-cert\") pod \"49536ae0-0390-4157-8cd9-1c2a5e86badf\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " Apr 16 22:17:20.704058 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.703987 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qqd\" (UniqueName: \"kubernetes.io/projected/49536ae0-0390-4157-8cd9-1c2a5e86badf-kube-api-access-z9qqd\") pod \"49536ae0-0390-4157-8cd9-1c2a5e86badf\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " Apr 16 22:17:20.704058 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.704011 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-service-ca\") pod \"49536ae0-0390-4157-8cd9-1c2a5e86badf\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " Apr 16 22:17:20.704058 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.704025 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-oauth-serving-cert\") pod \"49536ae0-0390-4157-8cd9-1c2a5e86badf\" (UID: 
\"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " Apr 16 22:17:20.704258 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.704081 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-oauth-config\") pod \"49536ae0-0390-4157-8cd9-1c2a5e86badf\" (UID: \"49536ae0-0390-4157-8cd9-1c2a5e86badf\") " Apr 16 22:17:20.704434 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.704397 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-config" (OuterVolumeSpecName: "console-config") pod "49536ae0-0390-4157-8cd9-1c2a5e86badf" (UID: "49536ae0-0390-4157-8cd9-1c2a5e86badf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:20.704527 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.704443 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-service-ca" (OuterVolumeSpecName: "service-ca") pod "49536ae0-0390-4157-8cd9-1c2a5e86badf" (UID: "49536ae0-0390-4157-8cd9-1c2a5e86badf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:20.704527 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.704505 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "49536ae0-0390-4157-8cd9-1c2a5e86badf" (UID: "49536ae0-0390-4157-8cd9-1c2a5e86badf"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:20.706172 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.706148 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "49536ae0-0390-4157-8cd9-1c2a5e86badf" (UID: "49536ae0-0390-4157-8cd9-1c2a5e86badf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:20.706278 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.706259 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49536ae0-0390-4157-8cd9-1c2a5e86badf-kube-api-access-z9qqd" (OuterVolumeSpecName: "kube-api-access-z9qqd") pod "49536ae0-0390-4157-8cd9-1c2a5e86badf" (UID: "49536ae0-0390-4157-8cd9-1c2a5e86badf"). InnerVolumeSpecName "kube-api-access-z9qqd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:20.706324 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.706303 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "49536ae0-0390-4157-8cd9-1c2a5e86badf" (UID: "49536ae0-0390-4157-8cd9-1c2a5e86badf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:20.804928 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.804899 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-oauth-config\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:17:20.804928 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.804927 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-config\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:17:20.805086 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.804937 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49536ae0-0390-4157-8cd9-1c2a5e86badf-console-serving-cert\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:17:20.805086 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.804946 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z9qqd\" (UniqueName: \"kubernetes.io/projected/49536ae0-0390-4157-8cd9-1c2a5e86badf-kube-api-access-z9qqd\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:17:20.805086 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.804955 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-service-ca\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:17:20.805086 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:20.804964 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49536ae0-0390-4157-8cd9-1c2a5e86badf-oauth-serving-cert\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:17:21.395880 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:17:21.395850 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-655c56db74-ht6dw_49536ae0-0390-4157-8cd9-1c2a5e86badf/console/0.log" Apr 16 22:17:21.396349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.395887 2579 generic.go:358] "Generic (PLEG): container finished" podID="49536ae0-0390-4157-8cd9-1c2a5e86badf" containerID="73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6" exitCode=2 Apr 16 22:17:21.396349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.395950 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-655c56db74-ht6dw" Apr 16 22:17:21.396349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.395982 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655c56db74-ht6dw" event={"ID":"49536ae0-0390-4157-8cd9-1c2a5e86badf","Type":"ContainerDied","Data":"73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6"} Apr 16 22:17:21.396349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.396029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655c56db74-ht6dw" event={"ID":"49536ae0-0390-4157-8cd9-1c2a5e86badf","Type":"ContainerDied","Data":"927c1604f8548d123864e67aeece63365fc7a0fd137408a061b845cccfd2e881"} Apr 16 22:17:21.396349 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.396053 2579 scope.go:117] "RemoveContainer" containerID="73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6" Apr 16 22:17:21.397703 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.397686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nbj62" event={"ID":"3d72e208-6678-427c-826b-098451ce245c","Type":"ContainerStarted","Data":"15ce2dfd52e16937482de38a458ee6998a35589447ac2c0803e41db06af0cd4a"} Apr 16 22:17:21.404621 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.404606 2579 
scope.go:117] "RemoveContainer" containerID="73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6" Apr 16 22:17:21.404881 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:17:21.404862 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6\": container with ID starting with 73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6 not found: ID does not exist" containerID="73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6" Apr 16 22:17:21.404938 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.404889 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6"} err="failed to get container status \"73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6\": rpc error: code = NotFound desc = could not find container \"73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6\": container with ID starting with 73dc22972d04883744ba84b90db848a51e6deeafbbfadc346400c49ddcd33dc6 not found: ID does not exist" Apr 16 22:17:21.427660 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.427635 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-655c56db74-ht6dw"] Apr 16 22:17:21.434596 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.434574 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-655c56db74-ht6dw"] Apr 16 22:17:21.679982 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:21.677690 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49536ae0-0390-4157-8cd9-1c2a5e86badf" path="/var/lib/kubelet/pods/49536ae0-0390-4157-8cd9-1c2a5e86badf/volumes" Apr 16 22:17:39.562708 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:39.562670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:17:39.564969 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:39.564951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d7d39e-d19f-4a6e-8107-593903f29181-metrics-certs\") pod \"network-metrics-daemon-qgfjd\" (UID: \"a2d7d39e-d19f-4a6e-8107-593903f29181\") " pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:17:39.675761 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:39.675718 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gj5qm\"" Apr 16 22:17:39.683218 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:39.683197 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qgfjd" Apr 16 22:17:39.800218 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:39.799886 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qgfjd"] Apr 16 22:17:39.802039 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:17:39.802012 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d7d39e_d19f_4a6e_8107_593903f29181.slice/crio-5faa0e88cbba19a76a0791afcb05ac8b8f1a13550fbfe22f53b29afaed701bf1 WatchSource:0}: Error finding container 5faa0e88cbba19a76a0791afcb05ac8b8f1a13550fbfe22f53b29afaed701bf1: Status 404 returned error can't find the container with id 5faa0e88cbba19a76a0791afcb05ac8b8f1a13550fbfe22f53b29afaed701bf1 Apr 16 22:17:40.459062 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:40.458981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qgfjd" event={"ID":"a2d7d39e-d19f-4a6e-8107-593903f29181","Type":"ContainerStarted","Data":"5faa0e88cbba19a76a0791afcb05ac8b8f1a13550fbfe22f53b29afaed701bf1"} Apr 16 22:17:41.467129 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:41.467092 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qgfjd" event={"ID":"a2d7d39e-d19f-4a6e-8107-593903f29181","Type":"ContainerStarted","Data":"5607db5aa456673eb3789359d855f31d4e5b747d0559bf2129bed90995b050b3"} Apr 16 22:17:41.467129 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:41.467128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qgfjd" event={"ID":"a2d7d39e-d19f-4a6e-8107-593903f29181","Type":"ContainerStarted","Data":"ad09c1f3c6571e858a61c177855c05d11d47d9502fceb484755440a2e3b0fe59"} Apr 16 22:17:41.482881 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:17:41.482833 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-qgfjd" podStartSLOduration=253.589838748 podStartE2EDuration="4m14.482820268s" podCreationTimestamp="2026-04-16 22:13:27 +0000 UTC" firstStartedPulling="2026-04-16 22:17:39.804014894 +0000 UTC m=+252.649783790" lastFinishedPulling="2026-04-16 22:17:40.69699642 +0000 UTC m=+253.542765310" observedRunningTime="2026-04-16 22:17:41.481044722 +0000 UTC m=+254.326813634" watchObservedRunningTime="2026-04-16 22:17:41.482820268 +0000 UTC m=+254.328589159" Apr 16 22:18:04.304532 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.304496 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-85c96466d6-vt5bm"] Apr 16 22:18:04.304943 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.304842 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49536ae0-0390-4157-8cd9-1c2a5e86badf" containerName="console" Apr 16 22:18:04.304943 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.304852 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="49536ae0-0390-4157-8cd9-1c2a5e86badf" containerName="console" Apr 16 22:18:04.304943 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.304926 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="49536ae0-0390-4157-8cd9-1c2a5e86badf" containerName="console" Apr 16 22:18:04.309312 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.309290 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.312189 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.312165 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 22:18:04.312320 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.312296 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 22:18:04.312374 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.312299 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 22:18:04.312374 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.312299 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 22:18:04.312465 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.312307 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tx2hn\"" Apr 16 22:18:04.312668 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.312654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 22:18:04.317288 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.317265 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 22:18:04.322647 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.322625 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85c96466d6-vt5bm"] Apr 16 22:18:04.461840 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.461811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462000 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.461879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-serving-certs-ca-bundle\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462000 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.461906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-federate-client-tls\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462000 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.461925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-metrics-client-ca\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462000 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.461953 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-telemeter-client-tls\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462130 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.462010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462130 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.462036 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttljq\" (UniqueName: \"kubernetes.io/projected/f2484156-d744-43aa-8be7-f03249b65d82-kube-api-access-ttljq\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.462130 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.462068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-secret-telemeter-client\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" Apr 16 22:18:04.563441 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-telemeter-client-tls\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: 
\"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563441 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563441 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttljq\" (UniqueName: \"kubernetes.io/projected/f2484156-d744-43aa-8be7-f03249b65d82-kube-api-access-ttljq\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563706 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-secret-telemeter-client\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563706 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563706 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-serving-certs-ca-bundle\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563706 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563630 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-federate-client-tls\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.563706 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.563689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-metrics-client-ca\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.564408 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.564383 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-metrics-client-ca\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.564562 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.564539 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-serving-certs-ca-bundle\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.564646 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.564629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2484156-d744-43aa-8be7-f03249b65d82-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.566142 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.566114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-federate-client-tls\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.566236 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.566177 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-telemeter-client-tls\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.566339 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.566317 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.566497 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.566484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f2484156-d744-43aa-8be7-f03249b65d82-secret-telemeter-client\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.574827 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.574798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttljq\" (UniqueName: \"kubernetes.io/projected/f2484156-d744-43aa-8be7-f03249b65d82-kube-api-access-ttljq\") pod \"telemeter-client-85c96466d6-vt5bm\" (UID: \"f2484156-d744-43aa-8be7-f03249b65d82\") " pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.618888 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.618869 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm"
Apr 16 22:18:04.739453 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:04.739422 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85c96466d6-vt5bm"]
Apr 16 22:18:04.742653 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:18:04.742629 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2484156_d744_43aa_8be7_f03249b65d82.slice/crio-cbd878cd6adcd613ae8f05d627a832c965ccf92547b3847d1cc34f8a8bf4c260 WatchSource:0}: Error finding container cbd878cd6adcd613ae8f05d627a832c965ccf92547b3847d1cc34f8a8bf4c260: Status 404 returned error can't find the container with id cbd878cd6adcd613ae8f05d627a832c965ccf92547b3847d1cc34f8a8bf4c260
Apr 16 22:18:05.540239 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:05.540195 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" event={"ID":"f2484156-d744-43aa-8be7-f03249b65d82","Type":"ContainerStarted","Data":"cbd878cd6adcd613ae8f05d627a832c965ccf92547b3847d1cc34f8a8bf4c260"}
Apr 16 22:18:06.088467 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:18:06.088420 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rpbsg" podUID="7b7fa80a-7e5b-4b14-8792-ff01bd1f2143"
Apr 16 22:18:06.544674 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:06.544638 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" event={"ID":"f2484156-d744-43aa-8be7-f03249b65d82","Type":"ContainerStarted","Data":"4c89cd2672cb3bc55f84fb30c5ace7ca65c73dc1cb3997bca482a690167f2d73"}
Apr 16 22:18:06.545093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:06.544678 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" event={"ID":"f2484156-d744-43aa-8be7-f03249b65d82","Type":"ContainerStarted","Data":"494c330bff0f803462dc6f17ebed75a371bce3ba26bd87613d40a97de3291ac8"}
Apr 16 22:18:06.545093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:06.544691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" event={"ID":"f2484156-d744-43aa-8be7-f03249b65d82","Type":"ContainerStarted","Data":"43ea58198b72788bbcb05536f92da8da5a5214532c592b6ed524b25b00879f21"}
Apr 16 22:18:06.545093 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:06.544785 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:18:06.566861 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:06.566818 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-85c96466d6-vt5bm" podStartSLOduration=1.00550282 podStartE2EDuration="2.566806412s" podCreationTimestamp="2026-04-16 22:18:04 +0000 UTC" firstStartedPulling="2026-04-16 22:18:04.744378497 +0000 UTC m=+277.590147388" lastFinishedPulling="2026-04-16 22:18:06.305682089 +0000 UTC m=+279.151450980" observedRunningTime="2026-04-16 22:18:06.564791886 +0000 UTC m=+279.410560798" watchObservedRunningTime="2026-04-16 22:18:06.566806412 +0000 UTC m=+279.412575323"
Apr 16 22:18:10.009071 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.009039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:18:10.009427 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.009080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:18:10.011362 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.011333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b7fa80a-7e5b-4b14-8792-ff01bd1f2143-metrics-tls\") pod \"dns-default-rpbsg\" (UID: \"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143\") " pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:18:10.011617 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.011595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed1be888-8420-4861-992a-ffd27fc02a14-cert\") pod \"ingress-canary-sfffv\" (UID: \"ed1be888-8420-4861-992a-ffd27fc02a14\") " pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:18:10.075358 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.075334 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bd46\""
Apr 16 22:18:10.083318 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.083301 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfffv"
Apr 16 22:18:10.148777 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.148755 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6fpb6\""
Apr 16 22:18:10.158645 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.156711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:18:10.204215 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.204185 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sfffv"]
Apr 16 22:18:10.205992 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:18:10.205960 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded1be888_8420_4861_992a_ffd27fc02a14.slice/crio-5e5c02a38c308c1cefef25035ffa5ad63da079686966c783ed1bfd206ba142f3 WatchSource:0}: Error finding container 5e5c02a38c308c1cefef25035ffa5ad63da079686966c783ed1bfd206ba142f3: Status 404 returned error can't find the container with id 5e5c02a38c308c1cefef25035ffa5ad63da079686966c783ed1bfd206ba142f3
Apr 16 22:18:10.282030 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.282006 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rpbsg"]
Apr 16 22:18:10.284324 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:18:10.284296 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7fa80a_7e5b_4b14_8792_ff01bd1f2143.slice/crio-b63700225a69a5f9bb2243fc1634d06de37b4aa0c56420da5ad08e93e06800b6 WatchSource:0}: Error finding container b63700225a69a5f9bb2243fc1634d06de37b4aa0c56420da5ad08e93e06800b6: Status 404 returned error can't find the container with id b63700225a69a5f9bb2243fc1634d06de37b4aa0c56420da5ad08e93e06800b6
Apr 16 22:18:10.556522 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.556425 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sfffv" event={"ID":"ed1be888-8420-4861-992a-ffd27fc02a14","Type":"ContainerStarted","Data":"5e5c02a38c308c1cefef25035ffa5ad63da079686966c783ed1bfd206ba142f3"}
Apr 16 22:18:10.557350 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:10.557327 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpbsg" event={"ID":"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143","Type":"ContainerStarted","Data":"b63700225a69a5f9bb2243fc1634d06de37b4aa0c56420da5ad08e93e06800b6"}
Apr 16 22:18:12.565572 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:12.565535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sfffv" event={"ID":"ed1be888-8420-4861-992a-ffd27fc02a14","Type":"ContainerStarted","Data":"a67ea1413182bc8f4675ef425c0d87bb9399e321b4e5f9ea9707e2a18f8a93d0"}
Apr 16 22:18:12.567099 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:12.567076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpbsg" event={"ID":"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143","Type":"ContainerStarted","Data":"51c9409f9bc2db1479cc439130e89868a4fd7dc8030680a32aaa29ed295be7ab"}
Apr 16 22:18:12.567198 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:12.567103 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpbsg" event={"ID":"7b7fa80a-7e5b-4b14-8792-ff01bd1f2143","Type":"ContainerStarted","Data":"c472a97a892906d0f40d1ebdd1c51934b65618bcf0f2dbd74bcd595e7e96bfb0"}
Apr 16 22:18:12.567198 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:12.567132 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:18:12.580162 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:12.580116 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sfffv" podStartSLOduration=251.777549527 podStartE2EDuration="4m13.580103101s" podCreationTimestamp="2026-04-16 22:13:59 +0000 UTC" firstStartedPulling="2026-04-16 22:18:10.207787172 +0000 UTC m=+283.053556061" lastFinishedPulling="2026-04-16 22:18:12.010340741 +0000 UTC m=+284.856109635" observedRunningTime="2026-04-16 22:18:12.57967587 +0000 UTC m=+285.425444783" watchObservedRunningTime="2026-04-16 22:18:12.580103101 +0000 UTC m=+285.425872015"
Apr 16 22:18:12.596747 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:12.596688 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rpbsg" podStartSLOduration=251.876674344 podStartE2EDuration="4m13.596676699s" podCreationTimestamp="2026-04-16 22:13:59 +0000 UTC" firstStartedPulling="2026-04-16 22:18:10.28613393 +0000 UTC m=+283.131902822" lastFinishedPulling="2026-04-16 22:18:12.006136285 +0000 UTC m=+284.851905177" observedRunningTime="2026-04-16 22:18:12.595167622 +0000 UTC m=+285.440936535" watchObservedRunningTime="2026-04-16 22:18:12.596676699 +0000 UTC m=+285.442445610"
Apr 16 22:18:22.572244 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:22.572213 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rpbsg"
Apr 16 22:18:27.605886 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:27.605851 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log"
Apr 16 22:18:27.606652 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:27.606605 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log"
Apr 16 22:18:27.610122 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:27.610099 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log"
Apr 16 22:18:27.610783 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:27.610765 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log"
Apr 16 22:18:27.616860 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:27.616843 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 22:18:28.450804 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:28.450774 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb55f8c68-f7znq"]
Apr 16 22:18:53.470397 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.470335 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cb55f8c68-f7znq" podUID="4a27d86a-93d5-49db-91e6-4ccb8863b1d9" containerName="console" containerID="cri-o://a51f1d20c5d561fb974ac4f4aebfb746ee20ea7b2412585720c9d19aa05904c5" gracePeriod=15
Apr 16 22:18:53.700613 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.700585 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb55f8c68-f7znq_4a27d86a-93d5-49db-91e6-4ccb8863b1d9/console/0.log"
Apr 16 22:18:53.700751 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.700661 2579 generic.go:358] "Generic (PLEG): container finished" podID="4a27d86a-93d5-49db-91e6-4ccb8863b1d9" containerID="a51f1d20c5d561fb974ac4f4aebfb746ee20ea7b2412585720c9d19aa05904c5" exitCode=2
Apr 16 22:18:53.700751 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.700720 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb55f8c68-f7znq" event={"ID":"4a27d86a-93d5-49db-91e6-4ccb8863b1d9","Type":"ContainerDied","Data":"a51f1d20c5d561fb974ac4f4aebfb746ee20ea7b2412585720c9d19aa05904c5"}
Apr 16 22:18:53.712822 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.712805 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb55f8c68-f7znq_4a27d86a-93d5-49db-91e6-4ccb8863b1d9/console/0.log"
Apr 16 22:18:53.712913 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.712857 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb55f8c68-f7znq"
Apr 16 22:18:53.850843 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.850818 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-trusted-ca-bundle\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.850848 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlhqq\" (UniqueName: \"kubernetes.io/projected/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-kube-api-access-hlhqq\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.850868 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-oauth-serving-cert\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.850895 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-config\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.850922 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-serving-cert\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851010 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.850997 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-oauth-config\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851340 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.851049 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-service-ca\") pod \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\" (UID: \"4a27d86a-93d5-49db-91e6-4ccb8863b1d9\") "
Apr 16 22:18:53.851340 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.851276 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:53.851340 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.851288 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:53.851499 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.851461 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-config" (OuterVolumeSpecName: "console-config") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:53.851499 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.851488 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:53.853148 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.853116 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:18:53.853256 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.853175 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:18:53.853256 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.853178 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-kube-api-access-hlhqq" (OuterVolumeSpecName: "kube-api-access-hlhqq") pod "4a27d86a-93d5-49db-91e6-4ccb8863b1d9" (UID: "4a27d86a-93d5-49db-91e6-4ccb8863b1d9"). InnerVolumeSpecName "kube-api-access-hlhqq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:18:53.951667 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951640 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-trusted-ca-bundle\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:53.951667 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951665 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hlhqq\" (UniqueName: \"kubernetes.io/projected/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-kube-api-access-hlhqq\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:53.951855 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951675 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-oauth-serving-cert\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:53.951855 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951684 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-config\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:53.951855 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951693 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-serving-cert\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:53.951855 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951702 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-console-oauth-config\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:53.951855 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:53.951710 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a27d86a-93d5-49db-91e6-4ccb8863b1d9-service-ca\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\""
Apr 16 22:18:54.704556 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:54.704521 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb55f8c68-f7znq_4a27d86a-93d5-49db-91e6-4ccb8863b1d9/console/0.log"
Apr 16 22:18:54.705040 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:54.704597 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb55f8c68-f7znq" event={"ID":"4a27d86a-93d5-49db-91e6-4ccb8863b1d9","Type":"ContainerDied","Data":"c90d5283f7fad872c77233477843beaff50a9e2443e7c52c59b7869f5a411de4"}
Apr 16 22:18:54.705040 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:54.704626 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb55f8c68-f7znq"
Apr 16 22:18:54.705040 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:54.704636 2579 scope.go:117] "RemoveContainer" containerID="a51f1d20c5d561fb974ac4f4aebfb746ee20ea7b2412585720c9d19aa05904c5"
Apr 16 22:18:54.725913 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:54.725890 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb55f8c68-f7znq"]
Apr 16 22:18:54.732475 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:54.732454 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cb55f8c68-f7znq"]
Apr 16 22:18:55.676669 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:55.676636 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a27d86a-93d5-49db-91e6-4ccb8863b1d9" path="/var/lib/kubelet/pods/4a27d86a-93d5-49db-91e6-4ccb8863b1d9/volumes"
Apr 16 22:18:56.948949 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.948917 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"]
Apr 16 22:18:56.949343 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.949238 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a27d86a-93d5-49db-91e6-4ccb8863b1d9" containerName="console"
Apr 16 22:18:56.949343 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.949248 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a27d86a-93d5-49db-91e6-4ccb8863b1d9" containerName="console"
Apr 16 22:18:56.949343 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.949316 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a27d86a-93d5-49db-91e6-4ccb8863b1d9" containerName="console"
Apr 16 22:18:56.953788 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.953770 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:56.956355 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.956333 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 22:18:56.956454 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.956363 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 22:18:56.957425 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.957409 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-nn5d2\""
Apr 16 22:18:56.960376 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:56.960354 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"]
Apr 16 22:18:57.078649 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.078624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.078788 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.078654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8j24\" (UniqueName: \"kubernetes.io/projected/e2e456f8-ce67-4145-a12a-ef5749e072e8-kube-api-access-m8j24\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.078788 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.078772 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.179235 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.179209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.179357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.179259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.179357 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.179288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8j24\" (UniqueName: \"kubernetes.io/projected/e2e456f8-ce67-4145-a12a-ef5749e072e8-kube-api-access-m8j24\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.179584 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.179568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.179632 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.179617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.188243 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.188218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8j24\" (UniqueName: \"kubernetes.io/projected/e2e456f8-ce67-4145-a12a-ef5749e072e8-kube-api-access-m8j24\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.263704 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.263684 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"
Apr 16 22:18:57.395231 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.395211 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6"]
Apr 16 22:18:57.397757 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:18:57.397707 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e456f8_ce67_4145_a12a_ef5749e072e8.slice/crio-a9125cb485c0fd17d3d77b740896a3077cbfab4205c950d023465689e20c31a4 WatchSource:0}: Error finding container a9125cb485c0fd17d3d77b740896a3077cbfab4205c950d023465689e20c31a4: Status 404 returned error can't find the container with id a9125cb485c0fd17d3d77b740896a3077cbfab4205c950d023465689e20c31a4
Apr 16 22:18:57.399414 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.399398 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:18:57.719012 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:18:57.718922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" event={"ID":"e2e456f8-ce67-4145-a12a-ef5749e072e8","Type":"ContainerStarted","Data":"a9125cb485c0fd17d3d77b740896a3077cbfab4205c950d023465689e20c31a4"}
Apr 16 22:19:02.745267 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:02.745232 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerID="4d763b3eb2b88f9c5aab6ded8f38bba499e279dfe31eed0224a275edf5300b62" exitCode=0
Apr 16 22:19:02.745720 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:02.745278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" event={"ID":"e2e456f8-ce67-4145-a12a-ef5749e072e8","Type":"ContainerDied","Data":"4d763b3eb2b88f9c5aab6ded8f38bba499e279dfe31eed0224a275edf5300b62"}
Apr 16 22:19:05.756290 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:05.756260 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerID="1ad957817817e3a3f5faed1c2e13c4512dc63377c142d3b6853f4126b28b8949" exitCode=0
Apr 16 22:19:05.756652 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:05.756326 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" event={"ID":"e2e456f8-ce67-4145-a12a-ef5749e072e8","Type":"ContainerDied","Data":"1ad957817817e3a3f5faed1c2e13c4512dc63377c142d3b6853f4126b28b8949"}
Apr 16 22:19:11.777937 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:11.777907 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerID="19d7787d4c97a38fda83f856b7e53d8b1b0c5e492622c4ea4ee1933d6acfd4e1" exitCode=0
Apr 16 22:19:11.778281 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:11.777993 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" event={"ID":"e2e456f8-ce67-4145-a12a-ef5749e072e8","Type":"ContainerDied","Data":"19d7787d4c97a38fda83f856b7e53d8b1b0c5e492622c4ea4ee1933d6acfd4e1"}
Apr 16 22:19:12.904066 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.904043 2579 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" Apr 16 22:19:12.911668 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.911650 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-util\") pod \"e2e456f8-ce67-4145-a12a-ef5749e072e8\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " Apr 16 22:19:12.911758 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.911700 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-bundle\") pod \"e2e456f8-ce67-4145-a12a-ef5749e072e8\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " Apr 16 22:19:12.911758 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.911750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8j24\" (UniqueName: \"kubernetes.io/projected/e2e456f8-ce67-4145-a12a-ef5749e072e8-kube-api-access-m8j24\") pod \"e2e456f8-ce67-4145-a12a-ef5749e072e8\" (UID: \"e2e456f8-ce67-4145-a12a-ef5749e072e8\") " Apr 16 22:19:12.912256 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.912229 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-bundle" (OuterVolumeSpecName: "bundle") pod "e2e456f8-ce67-4145-a12a-ef5749e072e8" (UID: "e2e456f8-ce67-4145-a12a-ef5749e072e8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:19:12.913811 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.913784 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e456f8-ce67-4145-a12a-ef5749e072e8-kube-api-access-m8j24" (OuterVolumeSpecName: "kube-api-access-m8j24") pod "e2e456f8-ce67-4145-a12a-ef5749e072e8" (UID: "e2e456f8-ce67-4145-a12a-ef5749e072e8"). InnerVolumeSpecName "kube-api-access-m8j24". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:19:12.915967 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:12.915941 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-util" (OuterVolumeSpecName: "util") pod "e2e456f8-ce67-4145-a12a-ef5749e072e8" (UID: "e2e456f8-ce67-4145-a12a-ef5749e072e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:19:13.013268 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:13.013221 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-util\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:19:13.013268 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:13.013263 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2e456f8-ce67-4145-a12a-ef5749e072e8-bundle\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:19:13.013268 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:13.013273 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8j24\" (UniqueName: \"kubernetes.io/projected/e2e456f8-ce67-4145-a12a-ef5749e072e8-kube-api-access-m8j24\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:19:13.785740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:13.785692 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" event={"ID":"e2e456f8-ce67-4145-a12a-ef5749e072e8","Type":"ContainerDied","Data":"a9125cb485c0fd17d3d77b740896a3077cbfab4205c950d023465689e20c31a4"} Apr 16 22:19:13.785740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:13.785705 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c97bv6" Apr 16 22:19:13.785740 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:13.785748 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9125cb485c0fd17d3d77b740896a3077cbfab4205c950d023465689e20c31a4" Apr 16 22:19:23.434179 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434142 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x"] Apr 16 22:19:23.434614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434530 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerName="util" Apr 16 22:19:23.434614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434547 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerName="util" Apr 16 22:19:23.434614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434570 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerName="pull" Apr 16 22:19:23.434614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434575 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerName="pull" Apr 16 22:19:23.434614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434582 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" 
containerName="extract" Apr 16 22:19:23.434614 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434587 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerName="extract" Apr 16 22:19:23.434931 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.434647 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2e456f8-ce67-4145-a12a-ef5749e072e8" containerName="extract" Apr 16 22:19:23.441134 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.441117 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.443869 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.443841 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 22:19:23.444115 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.444095 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 22:19:23.444221 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.444203 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 22:19:23.444316 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.444293 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rkcdv\"" Apr 16 22:19:23.445137 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.445119 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 22:19:23.445346 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.445142 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 22:19:23.447021 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.447001 2579 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x"] Apr 16 22:19:23.495603 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.495582 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.495764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.495612 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e9eca10a-eae5-499a-a76a-981dd3a9a457-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.495764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.495630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdz4\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-kube-api-access-hqdz4\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.596623 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.596599 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.596764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.596631 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e9eca10a-eae5-499a-a76a-981dd3a9a457-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.596764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.596649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdz4\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-kube-api-access-hqdz4\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.596860 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:19:23.596816 2579 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:19:23.596860 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:19:23.596832 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:19:23.596860 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:19:23.596847 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x: references non-existent secret key: tls.crt Apr 16 22:19:23.596955 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:19:23.596907 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-certificates podName:e9eca10a-eae5-499a-a76a-981dd3a9a457 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:24.096887589 +0000 UTC m=+356.942656484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-certificates") pod "keda-metrics-apiserver-7c9f485588-9j66x" (UID: "e9eca10a-eae5-499a-a76a-981dd3a9a457") : references non-existent secret key: tls.crt Apr 16 22:19:23.596999 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.596981 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e9eca10a-eae5-499a-a76a-981dd3a9a457-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.606259 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.606235 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdz4\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-kube-api-access-hqdz4\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:23.727130 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.727102 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-f6zz5"] Apr 16 22:19:23.731137 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.731121 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:23.734084 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.734063 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 22:19:23.739602 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.739582 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-f6zz5"] Apr 16 22:19:23.799158 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.799135 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmvl\" (UniqueName: \"kubernetes.io/projected/f1a500e0-a69e-44eb-a5f8-28fa995cf015-kube-api-access-xfmvl\") pod \"keda-admission-cf49989db-f6zz5\" (UID: \"f1a500e0-a69e-44eb-a5f8-28fa995cf015\") " pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:23.799269 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.799181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1a500e0-a69e-44eb-a5f8-28fa995cf015-certificates\") pod \"keda-admission-cf49989db-f6zz5\" (UID: \"f1a500e0-a69e-44eb-a5f8-28fa995cf015\") " pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:23.900532 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.900506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmvl\" (UniqueName: \"kubernetes.io/projected/f1a500e0-a69e-44eb-a5f8-28fa995cf015-kube-api-access-xfmvl\") pod \"keda-admission-cf49989db-f6zz5\" (UID: \"f1a500e0-a69e-44eb-a5f8-28fa995cf015\") " pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:23.900682 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.900543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/f1a500e0-a69e-44eb-a5f8-28fa995cf015-certificates\") pod \"keda-admission-cf49989db-f6zz5\" (UID: \"f1a500e0-a69e-44eb-a5f8-28fa995cf015\") " pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:23.902796 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.902776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1a500e0-a69e-44eb-a5f8-28fa995cf015-certificates\") pod \"keda-admission-cf49989db-f6zz5\" (UID: \"f1a500e0-a69e-44eb-a5f8-28fa995cf015\") " pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:23.909013 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:23.908991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfmvl\" (UniqueName: \"kubernetes.io/projected/f1a500e0-a69e-44eb-a5f8-28fa995cf015-kube-api-access-xfmvl\") pod \"keda-admission-cf49989db-f6zz5\" (UID: \"f1a500e0-a69e-44eb-a5f8-28fa995cf015\") " pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:24.043520 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.043448 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:24.102589 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.102553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:24.104914 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.104890 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e9eca10a-eae5-499a-a76a-981dd3a9a457-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9j66x\" (UID: \"e9eca10a-eae5-499a-a76a-981dd3a9a457\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:24.170384 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.170358 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-f6zz5"] Apr 16 22:19:24.181175 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:19:24.181149 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a500e0_a69e_44eb_a5f8_28fa995cf015.slice/crio-113f66683b1952ee70decd1c7b54f77f098e29079659cff75bedd0fa813803b7 WatchSource:0}: Error finding container 113f66683b1952ee70decd1c7b54f77f098e29079659cff75bedd0fa813803b7: Status 404 returned error can't find the container with id 113f66683b1952ee70decd1c7b54f77f098e29079659cff75bedd0fa813803b7 Apr 16 22:19:24.352310 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.352236 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:24.471502 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.471384 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x"] Apr 16 22:19:24.474026 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:19:24.474003 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9eca10a_eae5_499a_a76a_981dd3a9a457.slice/crio-ba562d9828c7337fad5fba2324247e5976753b75e75c10cabb2f9de9724254e4 WatchSource:0}: Error finding container ba562d9828c7337fad5fba2324247e5976753b75e75c10cabb2f9de9724254e4: Status 404 returned error can't find the container with id ba562d9828c7337fad5fba2324247e5976753b75e75c10cabb2f9de9724254e4 Apr 16 22:19:24.819741 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.819699 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-f6zz5" event={"ID":"f1a500e0-a69e-44eb-a5f8-28fa995cf015","Type":"ContainerStarted","Data":"113f66683b1952ee70decd1c7b54f77f098e29079659cff75bedd0fa813803b7"} Apr 16 22:19:24.820753 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:24.820708 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" event={"ID":"e9eca10a-eae5-499a-a76a-981dd3a9a457","Type":"ContainerStarted","Data":"ba562d9828c7337fad5fba2324247e5976753b75e75c10cabb2f9de9724254e4"} Apr 16 22:19:25.825304 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:25.825268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-f6zz5" event={"ID":"f1a500e0-a69e-44eb-a5f8-28fa995cf015","Type":"ContainerStarted","Data":"fc50f41b502cf44db28f97777869c866ed5eaa7f37abf075dfa0bc6795e93b09"} Apr 16 22:19:25.825749 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:25.825375 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:19:25.842079 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:25.842024 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-f6zz5" podStartSLOduration=1.412143755 podStartE2EDuration="2.842005982s" podCreationTimestamp="2026-04-16 22:19:23 +0000 UTC" firstStartedPulling="2026-04-16 22:19:24.182852098 +0000 UTC m=+357.028620988" lastFinishedPulling="2026-04-16 22:19:25.61271431 +0000 UTC m=+358.458483215" observedRunningTime="2026-04-16 22:19:25.841076432 +0000 UTC m=+358.686845345" watchObservedRunningTime="2026-04-16 22:19:25.842005982 +0000 UTC m=+358.687774896" Apr 16 22:19:27.834316 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:27.834286 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" event={"ID":"e9eca10a-eae5-499a-a76a-981dd3a9a457","Type":"ContainerStarted","Data":"0a8b37b7a72fb114d7fd014d9425c10d93bd8b04e8023322875e964559126c76"} Apr 16 22:19:27.834785 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:27.834455 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:27.853437 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:27.853391 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" podStartSLOduration=2.070898683 podStartE2EDuration="4.853376191s" podCreationTimestamp="2026-04-16 22:19:23 +0000 UTC" firstStartedPulling="2026-04-16 22:19:24.475500684 +0000 UTC m=+357.321269573" lastFinishedPulling="2026-04-16 22:19:27.257978187 +0000 UTC m=+360.103747081" observedRunningTime="2026-04-16 22:19:27.852598624 +0000 UTC m=+360.698367536" watchObservedRunningTime="2026-04-16 22:19:27.853376191 +0000 UTC m=+360.699145106" Apr 16 22:19:38.843050 ip-10-0-136-39 
kubenswrapper[2579]: I0416 22:19:38.843020 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9j66x" Apr 16 22:19:46.832406 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:19:46.832376 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-f6zz5" Apr 16 22:20:32.406554 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.406522 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6"] Apr 16 22:20:32.409362 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.409341 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.412131 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.412111 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:20:32.413070 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.413052 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 22:20:32.413180 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.413084 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:20:32.413180 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.413109 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zj77j\"" Apr 16 22:20:32.420968 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.420945 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6"] Apr 16 22:20:32.513239 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.513207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/85b604ac-4812-41d0-b537-df9b5f259f6a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2t5f6\" (UID: \"85b604ac-4812-41d0-b537-df9b5f259f6a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.513398 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.513247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c8qq\" (UniqueName: \"kubernetes.io/projected/85b604ac-4812-41d0-b537-df9b5f259f6a-kube-api-access-5c8qq\") pod \"llmisvc-controller-manager-68cc5db7c4-2t5f6\" (UID: \"85b604ac-4812-41d0-b537-df9b5f259f6a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.614699 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.614669 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85b604ac-4812-41d0-b537-df9b5f259f6a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2t5f6\" (UID: \"85b604ac-4812-41d0-b537-df9b5f259f6a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.614879 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.614704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c8qq\" (UniqueName: \"kubernetes.io/projected/85b604ac-4812-41d0-b537-df9b5f259f6a-kube-api-access-5c8qq\") pod \"llmisvc-controller-manager-68cc5db7c4-2t5f6\" (UID: \"85b604ac-4812-41d0-b537-df9b5f259f6a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.617053 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.617025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85b604ac-4812-41d0-b537-df9b5f259f6a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2t5f6\" (UID: \"85b604ac-4812-41d0-b537-df9b5f259f6a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.623135 
ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.623108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c8qq\" (UniqueName: \"kubernetes.io/projected/85b604ac-4812-41d0-b537-df9b5f259f6a-kube-api-access-5c8qq\") pod \"llmisvc-controller-manager-68cc5db7c4-2t5f6\" (UID: \"85b604ac-4812-41d0-b537-df9b5f259f6a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.720944 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.720865 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:32.844535 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:32.844499 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6"] Apr 16 22:20:33.066147 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:33.066101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" event={"ID":"85b604ac-4812-41d0-b537-df9b5f259f6a","Type":"ContainerStarted","Data":"8fb8602becfb8f00f2579a2cfec96498a6f2dbbda19c7c9b9ec6a0e00e966cf0"} Apr 16 22:20:35.074477 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:35.074442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" event={"ID":"85b604ac-4812-41d0-b537-df9b5f259f6a","Type":"ContainerStarted","Data":"14e77018dc993cd8c6db908a22ba468959f22349939e03795b637c883b343790"} Apr 16 22:20:35.074864 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:35.074653 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:20:35.092996 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:20:35.092955 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" podStartSLOduration=1.369473029 
podStartE2EDuration="3.09294391s" podCreationTimestamp="2026-04-16 22:20:32 +0000 UTC" firstStartedPulling="2026-04-16 22:20:32.851322731 +0000 UTC m=+425.697091621" lastFinishedPulling="2026-04-16 22:20:34.574793597 +0000 UTC m=+427.420562502" observedRunningTime="2026-04-16 22:20:35.091805987 +0000 UTC m=+427.937574902" watchObservedRunningTime="2026-04-16 22:20:35.09294391 +0000 UTC m=+427.938712822" Apr 16 22:21:06.082676 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:06.082648 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2t5f6" Apr 16 22:21:40.716391 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.716357 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-tmw2j"] Apr 16 22:21:40.719837 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.719820 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:40.722525 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.722494 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-c9wnq\"" Apr 16 22:21:40.722525 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.722520 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 22:21:40.728394 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.728366 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tmw2j"] Apr 16 22:21:40.748719 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.748688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-cert\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " 
pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:40.749367 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.749214 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzl4m\" (UniqueName: \"kubernetes.io/projected/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-kube-api-access-mzl4m\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:40.851191 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.851157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzl4m\" (UniqueName: \"kubernetes.io/projected/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-kube-api-access-mzl4m\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:40.851346 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.851219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-cert\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:40.851400 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:21:40.851359 2579 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 22:21:40.851459 ip-10-0-136-39 kubenswrapper[2579]: E0416 22:21:40.851427 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-cert podName:6bfd6dbc-43b3-4753-a985-31d5fd269dd0 nodeName:}" failed. No retries permitted until 2026-04-16 22:21:41.351402408 +0000 UTC m=+494.197171298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-cert") pod "odh-model-controller-696fc77849-tmw2j" (UID: "6bfd6dbc-43b3-4753-a985-31d5fd269dd0") : secret "odh-model-controller-webhook-cert" not found Apr 16 22:21:40.860560 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:40.860528 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzl4m\" (UniqueName: \"kubernetes.io/projected/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-kube-api-access-mzl4m\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:41.355514 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:41.355486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-cert\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:41.357863 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:41.357846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfd6dbc-43b3-4753-a985-31d5fd269dd0-cert\") pod \"odh-model-controller-696fc77849-tmw2j\" (UID: \"6bfd6dbc-43b3-4753-a985-31d5fd269dd0\") " pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:41.633462 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:41.633372 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:41.758310 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:41.758284 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tmw2j"] Apr 16 22:21:41.760643 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:21:41.760618 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bfd6dbc_43b3_4753_a985_31d5fd269dd0.slice/crio-6ddea9a9e5b772ba81ac4466d41a508b6fea23c178e39690e852f4771f217801 WatchSource:0}: Error finding container 6ddea9a9e5b772ba81ac4466d41a508b6fea23c178e39690e852f4771f217801: Status 404 returned error can't find the container with id 6ddea9a9e5b772ba81ac4466d41a508b6fea23c178e39690e852f4771f217801 Apr 16 22:21:42.299972 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:42.299902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tmw2j" event={"ID":"6bfd6dbc-43b3-4753-a985-31d5fd269dd0","Type":"ContainerStarted","Data":"6ddea9a9e5b772ba81ac4466d41a508b6fea23c178e39690e852f4771f217801"} Apr 16 22:21:44.318516 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:44.318486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tmw2j" event={"ID":"6bfd6dbc-43b3-4753-a985-31d5fd269dd0","Type":"ContainerStarted","Data":"8f9e7081a22cf77bcd5643e53b011ac130c1552559a0addd1b4208a0e227aa02"} Apr 16 22:21:44.318910 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:44.318577 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:44.337028 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:44.336702 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-tmw2j" podStartSLOduration=2.025570162 podStartE2EDuration="4.336688775s" 
podCreationTimestamp="2026-04-16 22:21:40 +0000 UTC" firstStartedPulling="2026-04-16 22:21:41.762254725 +0000 UTC m=+494.608023617" lastFinishedPulling="2026-04-16 22:21:44.073373339 +0000 UTC m=+496.919142230" observedRunningTime="2026-04-16 22:21:44.336419853 +0000 UTC m=+497.182188768" watchObservedRunningTime="2026-04-16 22:21:44.336688775 +0000 UTC m=+497.182457687" Apr 16 22:21:55.323750 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:55.323703 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-tmw2j" Apr 16 22:21:56.124292 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.124257 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-lr6f4"] Apr 16 22:21:56.128020 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.127998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lr6f4" Apr 16 22:21:56.130369 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.130351 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 22:21:56.130469 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.130353 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-btjqh\"" Apr 16 22:21:56.134105 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.134081 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-lr6f4"] Apr 16 22:21:56.172544 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.172523 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xsqq\" (UniqueName: \"kubernetes.io/projected/3909df36-8a3d-4e72-a2f3-8a59a16ff652-kube-api-access-5xsqq\") pod \"s3-init-lr6f4\" (UID: \"3909df36-8a3d-4e72-a2f3-8a59a16ff652\") " pod="kserve/s3-init-lr6f4" Apr 16 22:21:56.273710 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.273683 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xsqq\" (UniqueName: \"kubernetes.io/projected/3909df36-8a3d-4e72-a2f3-8a59a16ff652-kube-api-access-5xsqq\") pod \"s3-init-lr6f4\" (UID: \"3909df36-8a3d-4e72-a2f3-8a59a16ff652\") " pod="kserve/s3-init-lr6f4" Apr 16 22:21:56.282062 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.282042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xsqq\" (UniqueName: \"kubernetes.io/projected/3909df36-8a3d-4e72-a2f3-8a59a16ff652-kube-api-access-5xsqq\") pod \"s3-init-lr6f4\" (UID: \"3909df36-8a3d-4e72-a2f3-8a59a16ff652\") " pod="kserve/s3-init-lr6f4" Apr 16 22:21:56.444376 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.444308 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lr6f4" Apr 16 22:21:56.561031 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:56.561006 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-lr6f4"] Apr 16 22:21:56.563711 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:21:56.563678 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3909df36_8a3d_4e72_a2f3_8a59a16ff652.slice/crio-06f43917bc8885d0fd2dce8f737816914921b7ff2bdaa9f0ae9e37e142777c7f WatchSource:0}: Error finding container 06f43917bc8885d0fd2dce8f737816914921b7ff2bdaa9f0ae9e37e142777c7f: Status 404 returned error can't find the container with id 06f43917bc8885d0fd2dce8f737816914921b7ff2bdaa9f0ae9e37e142777c7f Apr 16 22:21:57.365051 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:21:57.365005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lr6f4" event={"ID":"3909df36-8a3d-4e72-a2f3-8a59a16ff652","Type":"ContainerStarted","Data":"06f43917bc8885d0fd2dce8f737816914921b7ff2bdaa9f0ae9e37e142777c7f"} Apr 16 22:22:01.383663 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:01.383627 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lr6f4" event={"ID":"3909df36-8a3d-4e72-a2f3-8a59a16ff652","Type":"ContainerStarted","Data":"eac124d85d18e2e69f623f3ed89ce0ab821b0f93137a82bc286a0d284c341b5c"} Apr 16 22:22:01.399572 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:01.399506 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-lr6f4" podStartSLOduration=1.162228596 podStartE2EDuration="5.399490885s" podCreationTimestamp="2026-04-16 22:21:56 +0000 UTC" firstStartedPulling="2026-04-16 22:21:56.565459223 +0000 UTC m=+509.411228117" lastFinishedPulling="2026-04-16 22:22:00.802721511 +0000 UTC m=+513.648490406" observedRunningTime="2026-04-16 22:22:01.397432114 +0000 UTC m=+514.243201028" watchObservedRunningTime="2026-04-16 22:22:01.399490885 +0000 UTC m=+514.245259796" Apr 16 22:22:04.395953 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:04.395920 2579 generic.go:358] "Generic (PLEG): container finished" podID="3909df36-8a3d-4e72-a2f3-8a59a16ff652" containerID="eac124d85d18e2e69f623f3ed89ce0ab821b0f93137a82bc286a0d284c341b5c" exitCode=0 Apr 16 22:22:04.396311 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:04.395992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lr6f4" event={"ID":"3909df36-8a3d-4e72-a2f3-8a59a16ff652","Type":"ContainerDied","Data":"eac124d85d18e2e69f623f3ed89ce0ab821b0f93137a82bc286a0d284c341b5c"} Apr 16 22:22:05.535169 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:05.535139 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-lr6f4" Apr 16 22:22:05.656271 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:05.656203 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xsqq\" (UniqueName: \"kubernetes.io/projected/3909df36-8a3d-4e72-a2f3-8a59a16ff652-kube-api-access-5xsqq\") pod \"3909df36-8a3d-4e72-a2f3-8a59a16ff652\" (UID: \"3909df36-8a3d-4e72-a2f3-8a59a16ff652\") " Apr 16 22:22:05.658399 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:05.658368 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3909df36-8a3d-4e72-a2f3-8a59a16ff652-kube-api-access-5xsqq" (OuterVolumeSpecName: "kube-api-access-5xsqq") pod "3909df36-8a3d-4e72-a2f3-8a59a16ff652" (UID: "3909df36-8a3d-4e72-a2f3-8a59a16ff652"). InnerVolumeSpecName "kube-api-access-5xsqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:05.757090 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:05.757054 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xsqq\" (UniqueName: \"kubernetes.io/projected/3909df36-8a3d-4e72-a2f3-8a59a16ff652-kube-api-access-5xsqq\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:22:06.404333 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:06.404304 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-lr6f4" Apr 16 22:22:06.404333 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:06.404317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lr6f4" event={"ID":"3909df36-8a3d-4e72-a2f3-8a59a16ff652","Type":"ContainerDied","Data":"06f43917bc8885d0fd2dce8f737816914921b7ff2bdaa9f0ae9e37e142777c7f"} Apr 16 22:22:06.404333 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:06.404342 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06f43917bc8885d0fd2dce8f737816914921b7ff2bdaa9f0ae9e37e142777c7f" Apr 16 22:22:40.443285 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.443199 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-bgx52"] Apr 16 22:22:40.443861 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.443787 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3909df36-8a3d-4e72-a2f3-8a59a16ff652" containerName="s3-init" Apr 16 22:22:40.443861 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.443807 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3909df36-8a3d-4e72-a2f3-8a59a16ff652" containerName="s3-init" Apr 16 22:22:40.443982 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.443940 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3909df36-8a3d-4e72-a2f3-8a59a16ff652" containerName="s3-init" Apr 16 22:22:40.479310 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.479281 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-bgx52"] Apr 16 22:22:40.479310 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.479299 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:40.482132 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.482113 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 22:22:40.482283 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.482264 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-btjqh\"" Apr 16 22:22:40.542027 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.542003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p28nh\" (UniqueName: \"kubernetes.io/projected/496c0a23-ddfc-4020-bcc7-0894a11db16c-kube-api-access-p28nh\") pod \"s3-tls-init-custom-bgx52\" (UID: \"496c0a23-ddfc-4020-bcc7-0894a11db16c\") " pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:40.643011 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.642981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p28nh\" (UniqueName: \"kubernetes.io/projected/496c0a23-ddfc-4020-bcc7-0894a11db16c-kube-api-access-p28nh\") pod \"s3-tls-init-custom-bgx52\" (UID: \"496c0a23-ddfc-4020-bcc7-0894a11db16c\") " pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:40.652670 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.652646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p28nh\" (UniqueName: \"kubernetes.io/projected/496c0a23-ddfc-4020-bcc7-0894a11db16c-kube-api-access-p28nh\") pod \"s3-tls-init-custom-bgx52\" (UID: \"496c0a23-ddfc-4020-bcc7-0894a11db16c\") " pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:40.804384 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.804346 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:40.923209 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:40.923066 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-bgx52"] Apr 16 22:22:40.925521 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:22:40.925496 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod496c0a23_ddfc_4020_bcc7_0894a11db16c.slice/crio-02d83d7f143774116f1d7e4d5f90f8a35b5ec181e58ac1cfeb1aa67f241e6bb8 WatchSource:0}: Error finding container 02d83d7f143774116f1d7e4d5f90f8a35b5ec181e58ac1cfeb1aa67f241e6bb8: Status 404 returned error can't find the container with id 02d83d7f143774116f1d7e4d5f90f8a35b5ec181e58ac1cfeb1aa67f241e6bb8 Apr 16 22:22:41.525767 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:41.525716 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-bgx52" event={"ID":"496c0a23-ddfc-4020-bcc7-0894a11db16c","Type":"ContainerStarted","Data":"ed56fb0927f0c336d736c35f025d21ecbb2989c3cbdf248878fd72d4e8701b07"} Apr 16 22:22:41.526151 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:41.525774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-bgx52" event={"ID":"496c0a23-ddfc-4020-bcc7-0894a11db16c","Type":"ContainerStarted","Data":"02d83d7f143774116f1d7e4d5f90f8a35b5ec181e58ac1cfeb1aa67f241e6bb8"} Apr 16 22:22:41.541658 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:41.541616 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-bgx52" podStartSLOduration=1.541601891 podStartE2EDuration="1.541601891s" podCreationTimestamp="2026-04-16 22:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:22:41.540477979 +0000 UTC m=+554.386246892" watchObservedRunningTime="2026-04-16 22:22:41.541601891 
+0000 UTC m=+554.387370803" Apr 16 22:22:45.541795 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:45.541759 2579 generic.go:358] "Generic (PLEG): container finished" podID="496c0a23-ddfc-4020-bcc7-0894a11db16c" containerID="ed56fb0927f0c336d736c35f025d21ecbb2989c3cbdf248878fd72d4e8701b07" exitCode=0 Apr 16 22:22:45.542141 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:45.541823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-bgx52" event={"ID":"496c0a23-ddfc-4020-bcc7-0894a11db16c","Type":"ContainerDied","Data":"ed56fb0927f0c336d736c35f025d21ecbb2989c3cbdf248878fd72d4e8701b07"} Apr 16 22:22:46.672677 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:46.672656 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:46.795598 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:46.795566 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p28nh\" (UniqueName: \"kubernetes.io/projected/496c0a23-ddfc-4020-bcc7-0894a11db16c-kube-api-access-p28nh\") pod \"496c0a23-ddfc-4020-bcc7-0894a11db16c\" (UID: \"496c0a23-ddfc-4020-bcc7-0894a11db16c\") " Apr 16 22:22:46.797542 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:46.797491 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496c0a23-ddfc-4020-bcc7-0894a11db16c-kube-api-access-p28nh" (OuterVolumeSpecName: "kube-api-access-p28nh") pod "496c0a23-ddfc-4020-bcc7-0894a11db16c" (UID: "496c0a23-ddfc-4020-bcc7-0894a11db16c"). InnerVolumeSpecName "kube-api-access-p28nh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:46.896503 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:46.896476 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p28nh\" (UniqueName: \"kubernetes.io/projected/496c0a23-ddfc-4020-bcc7-0894a11db16c-kube-api-access-p28nh\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:22:47.550289 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:47.550262 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-bgx52" Apr 16 22:22:47.550452 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:47.550285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-bgx52" event={"ID":"496c0a23-ddfc-4020-bcc7-0894a11db16c","Type":"ContainerDied","Data":"02d83d7f143774116f1d7e4d5f90f8a35b5ec181e58ac1cfeb1aa67f241e6bb8"} Apr 16 22:22:47.550452 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:47.550315 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d83d7f143774116f1d7e4d5f90f8a35b5ec181e58ac1cfeb1aa67f241e6bb8" Apr 16 22:22:50.710863 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.710834 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-zjmhj"] Apr 16 22:22:50.711216 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.711185 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="496c0a23-ddfc-4020-bcc7-0894a11db16c" containerName="s3-tls-init-custom" Apr 16 22:22:50.711216 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.711197 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="496c0a23-ddfc-4020-bcc7-0894a11db16c" containerName="s3-tls-init-custom" Apr 16 22:22:50.711287 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.711265 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="496c0a23-ddfc-4020-bcc7-0894a11db16c" containerName="s3-tls-init-custom" Apr 16 
22:22:50.714433 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.714413 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:50.717147 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.717121 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 22:22:50.717247 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.717215 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-btjqh\"" Apr 16 22:22:50.721454 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.721414 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-zjmhj"] Apr 16 22:22:50.828775 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.828746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vvx\" (UniqueName: \"kubernetes.io/projected/dedb1bb9-2004-41ae-b7b2-adfe73e6914a-kube-api-access-s7vvx\") pod \"s3-tls-init-serving-zjmhj\" (UID: \"dedb1bb9-2004-41ae-b7b2-adfe73e6914a\") " pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:50.929490 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.929463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vvx\" (UniqueName: \"kubernetes.io/projected/dedb1bb9-2004-41ae-b7b2-adfe73e6914a-kube-api-access-s7vvx\") pod \"s3-tls-init-serving-zjmhj\" (UID: \"dedb1bb9-2004-41ae-b7b2-adfe73e6914a\") " pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:50.937720 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:50.937698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vvx\" (UniqueName: \"kubernetes.io/projected/dedb1bb9-2004-41ae-b7b2-adfe73e6914a-kube-api-access-s7vvx\") pod \"s3-tls-init-serving-zjmhj\" (UID: \"dedb1bb9-2004-41ae-b7b2-adfe73e6914a\") " 
pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:51.031086 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:51.031052 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:51.149345 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:51.149320 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-zjmhj"] Apr 16 22:22:51.151472 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:22:51.151444 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddedb1bb9_2004_41ae_b7b2_adfe73e6914a.slice/crio-a9bb4d5c7ace593eea85640c0ef82ddf04ab80deff185b0bff292905ca4e3f46 WatchSource:0}: Error finding container a9bb4d5c7ace593eea85640c0ef82ddf04ab80deff185b0bff292905ca4e3f46: Status 404 returned error can't find the container with id a9bb4d5c7ace593eea85640c0ef82ddf04ab80deff185b0bff292905ca4e3f46 Apr 16 22:22:51.565508 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:51.565469 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-zjmhj" event={"ID":"dedb1bb9-2004-41ae-b7b2-adfe73e6914a","Type":"ContainerStarted","Data":"1321c3833939d7f834ddcb60be9b7edc4ef56c4659424fea1909abddd1022673"} Apr 16 22:22:51.565508 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:51.565511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-zjmhj" event={"ID":"dedb1bb9-2004-41ae-b7b2-adfe73e6914a","Type":"ContainerStarted","Data":"a9bb4d5c7ace593eea85640c0ef82ddf04ab80deff185b0bff292905ca4e3f46"} Apr 16 22:22:51.585519 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:51.585472 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-zjmhj" podStartSLOduration=1.58545856 podStartE2EDuration="1.58545856s" podCreationTimestamp="2026-04-16 22:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:22:51.582564626 +0000 UTC m=+564.428333538" watchObservedRunningTime="2026-04-16 22:22:51.58545856 +0000 UTC m=+564.431227472" Apr 16 22:22:57.587095 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:57.587061 2579 generic.go:358] "Generic (PLEG): container finished" podID="dedb1bb9-2004-41ae-b7b2-adfe73e6914a" containerID="1321c3833939d7f834ddcb60be9b7edc4ef56c4659424fea1909abddd1022673" exitCode=0 Apr 16 22:22:57.587464 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:57.587135 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-zjmhj" event={"ID":"dedb1bb9-2004-41ae-b7b2-adfe73e6914a","Type":"ContainerDied","Data":"1321c3833939d7f834ddcb60be9b7edc4ef56c4659424fea1909abddd1022673"} Apr 16 22:22:58.713350 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:58.713331 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:58.797039 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:58.797001 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7vvx\" (UniqueName: \"kubernetes.io/projected/dedb1bb9-2004-41ae-b7b2-adfe73e6914a-kube-api-access-s7vvx\") pod \"dedb1bb9-2004-41ae-b7b2-adfe73e6914a\" (UID: \"dedb1bb9-2004-41ae-b7b2-adfe73e6914a\") " Apr 16 22:22:58.799146 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:58.799116 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedb1bb9-2004-41ae-b7b2-adfe73e6914a-kube-api-access-s7vvx" (OuterVolumeSpecName: "kube-api-access-s7vvx") pod "dedb1bb9-2004-41ae-b7b2-adfe73e6914a" (UID: "dedb1bb9-2004-41ae-b7b2-adfe73e6914a"). InnerVolumeSpecName "kube-api-access-s7vvx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:58.898778 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:58.898688 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7vvx\" (UniqueName: \"kubernetes.io/projected/dedb1bb9-2004-41ae-b7b2-adfe73e6914a-kube-api-access-s7vvx\") on node \"ip-10-0-136-39.ec2.internal\" DevicePath \"\"" Apr 16 22:22:59.594976 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:59.594948 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-zjmhj" Apr 16 22:22:59.594976 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:59.594971 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-zjmhj" event={"ID":"dedb1bb9-2004-41ae-b7b2-adfe73e6914a","Type":"ContainerDied","Data":"a9bb4d5c7ace593eea85640c0ef82ddf04ab80deff185b0bff292905ca4e3f46"} Apr 16 22:22:59.595182 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:22:59.594998 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9bb4d5c7ace593eea85640c0ef82ddf04ab80deff185b0bff292905ca4e3f46" Apr 16 22:23:14.407204 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.407168 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f459c98f5-q44cn"] Apr 16 22:23:14.407642 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.407562 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dedb1bb9-2004-41ae-b7b2-adfe73e6914a" containerName="s3-tls-init-serving" Apr 16 22:23:14.407642 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.407580 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedb1bb9-2004-41ae-b7b2-adfe73e6914a" containerName="s3-tls-init-serving" Apr 16 22:23:14.407759 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.407664 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="dedb1bb9-2004-41ae-b7b2-adfe73e6914a" 
containerName="s3-tls-init-serving" Apr 16 22:23:14.410475 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.410455 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.413028 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.412950 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:23:14.413028 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.412973 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:23:14.413306 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.413293 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:23:14.413905 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.413885 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:23:14.414129 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.414112 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:23:14.414252 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.414112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v9r8c\"" Apr 16 22:23:14.421138 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.421118 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f459c98f5-q44cn"] Apr 16 22:23:14.421264 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.421246 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:23:14.519202 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519169 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-console-config\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.519359 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519216 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c199ff16-213e-4848-bf36-552d5c622b1e-console-oauth-config\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.519359 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw7ls\" (UniqueName: \"kubernetes.io/projected/c199ff16-213e-4848-bf36-552d5c622b1e-kube-api-access-qw7ls\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.519442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519355 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-oauth-serving-cert\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.519442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c199ff16-213e-4848-bf36-552d5c622b1e-console-serving-cert\") pod \"console-6f459c98f5-q44cn\" (UID: 
\"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.519442 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519415 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-service-ca\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.519549 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.519453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-trusted-ca-bundle\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620478 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620447 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c199ff16-213e-4848-bf36-552d5c622b1e-console-oauth-config\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620606 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620495 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw7ls\" (UniqueName: \"kubernetes.io/projected/c199ff16-213e-4848-bf36-552d5c622b1e-kube-api-access-qw7ls\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620606 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-oauth-serving-cert\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620606 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c199ff16-213e-4848-bf36-552d5c622b1e-console-serving-cert\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620606 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-service-ca\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620852 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-trusted-ca-bundle\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.620852 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.620677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-console-config\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.621396 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.621372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-console-config\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.621492 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.621394 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-service-ca\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.621492 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.621380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-oauth-serving-cert\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.621492 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.621485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c199ff16-213e-4848-bf36-552d5c622b1e-trusted-ca-bundle\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.623094 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.623072 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c199ff16-213e-4848-bf36-552d5c622b1e-console-oauth-config\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.623173 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.623115 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c199ff16-213e-4848-bf36-552d5c622b1e-console-serving-cert\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.629066 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.629048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw7ls\" (UniqueName: \"kubernetes.io/projected/c199ff16-213e-4848-bf36-552d5c622b1e-kube-api-access-qw7ls\") pod \"console-6f459c98f5-q44cn\" (UID: \"c199ff16-213e-4848-bf36-552d5c622b1e\") " pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.721491 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.721417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:14.841798 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:14.841775 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f459c98f5-q44cn"] Apr 16 22:23:14.843987 ip-10-0-136-39 kubenswrapper[2579]: W0416 22:23:14.843957 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc199ff16_213e_4848_bf36_552d5c622b1e.slice/crio-1d1653ece47e7cafd6e0638768f3c43d1a8a366e209fe4757ae47efef7f5da63 WatchSource:0}: Error finding container 1d1653ece47e7cafd6e0638768f3c43d1a8a366e209fe4757ae47efef7f5da63: Status 404 returned error can't find the container with id 1d1653ece47e7cafd6e0638768f3c43d1a8a366e209fe4757ae47efef7f5da63 Apr 16 22:23:15.648199 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:15.648162 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f459c98f5-q44cn" event={"ID":"c199ff16-213e-4848-bf36-552d5c622b1e","Type":"ContainerStarted","Data":"a0c27fb194973e1c9fea18d0346ed43ed456e29b1445d72de01269619f8f8979"} Apr 16 22:23:15.648199 
ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:15.648202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f459c98f5-q44cn" event={"ID":"c199ff16-213e-4848-bf36-552d5c622b1e","Type":"ContainerStarted","Data":"1d1653ece47e7cafd6e0638768f3c43d1a8a366e209fe4757ae47efef7f5da63"} Apr 16 22:23:15.666264 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:15.666217 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f459c98f5-q44cn" podStartSLOduration=1.66620365 podStartE2EDuration="1.66620365s" podCreationTimestamp="2026-04-16 22:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:23:15.664797545 +0000 UTC m=+588.510566459" watchObservedRunningTime="2026-04-16 22:23:15.66620365 +0000 UTC m=+588.511972559" Apr 16 22:23:24.721926 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:24.721880 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:24.721926 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:24.721934 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:24.726519 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:24.726499 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:25.685438 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:25.685409 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f459c98f5-q44cn" Apr 16 22:23:27.635480 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:27.635451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 
22:23:27.636208 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:27.636187 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:23:27.639780 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:27.639759 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:23:27.640467 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:23:27.640447 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:28:27.664223 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:28:27.664150 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:28:27.667889 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:28:27.667868 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:28:27.671067 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:28:27.671053 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:28:27.674425 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:28:27.674404 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:33:27.700645 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:33:27.700610 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:33:27.704277 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:33:27.704242 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:33:27.704859 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:33:27.704837 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:33:27.708404 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:33:27.708388 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:38:27.728962 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:38:27.728936 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:38:27.732764 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:38:27.732716 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:38:27.734568 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:38:27.734548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:38:27.738115 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:38:27.738098 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:43:27.755465 ip-10-0-136-39 kubenswrapper[2579]: I0416 
22:43:27.755442 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:43:27.761188 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:43:27.761163 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:43:27.762920 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:43:27.762900 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:43:27.766284 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:43:27.766266 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:48:27.784124 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:48:27.784096 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:48:27.787671 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:48:27.787650 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:48:27.789511 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:48:27.789490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:48:27.792965 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:48:27.792946 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 
16 22:53:27.811362 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:53:27.811323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:53:27.820838 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:53:27.820809 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:53:27.822991 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:53:27.822969 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:53:27.826627 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:53:27.826609 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:58:27.843400 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:58:27.843326 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:58:27.847143 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:58:27.847122 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 22:58:27.851159 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:58:27.851142 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 22:58:27.854843 ip-10-0-136-39 kubenswrapper[2579]: I0416 22:58:27.854828 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:03:27.871870 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:03:27.871843 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:03:27.875664 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:03:27.875642 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:03:27.878941 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:03:27.878922 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:03:27.882420 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:03:27.882406 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:08:27.898434 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:08:27.898393 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:08:27.902228 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:08:27.902208 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:08:27.906699 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:08:27.906677 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:08:27.910064 ip-10-0-136-39 kubenswrapper[2579]: I0416 
23:08:27.910050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:13:27.925480 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:13:27.925445 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:13:27.929107 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:13:27.929079 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:13:27.933058 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:13:27.933041 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:13:27.936293 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:13:27.936277 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:18:27.954649 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:18:27.954547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 16 23:18:27.958580 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:18:27.958401 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:18:27.961861 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:18:27.961843 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log" Apr 
16 23:18:27.966867 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:18:27.966848 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log" Apr 16 23:19:12.366154 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.366075 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8ccpf/must-gather-6gxgf"] Apr 16 23:19:12.369415 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.369400 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.372018 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.371993 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8ccpf\"/\"default-dockercfg-hkh9t\"" Apr 16 23:19:12.372127 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.371999 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8ccpf\"/\"kube-root-ca.crt\"" Apr 16 23:19:12.372127 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.372002 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8ccpf\"/\"openshift-service-ca.crt\"" Apr 16 23:19:12.376822 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.376794 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/must-gather-6gxgf"] Apr 16 23:19:12.472436 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.472411 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c552b-ed29-4194-94c9-31da1aeeaff4-must-gather-output\") pod \"must-gather-6gxgf\" (UID: \"130c552b-ed29-4194-94c9-31da1aeeaff4\") " pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.472572 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.472462 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2bfb\" (UniqueName: \"kubernetes.io/projected/130c552b-ed29-4194-94c9-31da1aeeaff4-kube-api-access-d2bfb\") pod \"must-gather-6gxgf\" (UID: \"130c552b-ed29-4194-94c9-31da1aeeaff4\") " pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.573872 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.573843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c552b-ed29-4194-94c9-31da1aeeaff4-must-gather-output\") pod \"must-gather-6gxgf\" (UID: \"130c552b-ed29-4194-94c9-31da1aeeaff4\") " pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.573982 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.573882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2bfb\" (UniqueName: \"kubernetes.io/projected/130c552b-ed29-4194-94c9-31da1aeeaff4-kube-api-access-d2bfb\") pod \"must-gather-6gxgf\" (UID: \"130c552b-ed29-4194-94c9-31da1aeeaff4\") " pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.574169 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.574152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c552b-ed29-4194-94c9-31da1aeeaff4-must-gather-output\") pod \"must-gather-6gxgf\" (UID: \"130c552b-ed29-4194-94c9-31da1aeeaff4\") " pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.582181 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.582156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2bfb\" (UniqueName: \"kubernetes.io/projected/130c552b-ed29-4194-94c9-31da1aeeaff4-kube-api-access-d2bfb\") pod \"must-gather-6gxgf\" (UID: \"130c552b-ed29-4194-94c9-31da1aeeaff4\") " pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 
16 23:19:12.692994 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.692933 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8ccpf/must-gather-6gxgf" Apr 16 23:19:12.814056 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.814032 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/must-gather-6gxgf"] Apr 16 23:19:12.816059 ip-10-0-136-39 kubenswrapper[2579]: W0416 23:19:12.816017 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130c552b_ed29_4194_94c9_31da1aeeaff4.slice/crio-45c4a16fbde19902eda0a170374c5f5d915cdfe6b5cf9d3af8fdda842c3a50cc WatchSource:0}: Error finding container 45c4a16fbde19902eda0a170374c5f5d915cdfe6b5cf9d3af8fdda842c3a50cc: Status 404 returned error can't find the container with id 45c4a16fbde19902eda0a170374c5f5d915cdfe6b5cf9d3af8fdda842c3a50cc Apr 16 23:19:12.817639 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:12.817623 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:19:13.171067 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:13.171035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/must-gather-6gxgf" event={"ID":"130c552b-ed29-4194-94c9-31da1aeeaff4","Type":"ContainerStarted","Data":"45c4a16fbde19902eda0a170374c5f5d915cdfe6b5cf9d3af8fdda842c3a50cc"} Apr 16 23:19:14.177090 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:14.177046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/must-gather-6gxgf" event={"ID":"130c552b-ed29-4194-94c9-31da1aeeaff4","Type":"ContainerStarted","Data":"af52d8bd82f6211d7b2f64894a09a0635cc3daf90935852b538973f583ae6e07"} Apr 16 23:19:14.177090 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:14.177093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-8ccpf/must-gather-6gxgf" event={"ID":"130c552b-ed29-4194-94c9-31da1aeeaff4","Type":"ContainerStarted","Data":"36d47b91469eb6f3c76a008961f70e08ca2bb4bd408cebfa9af8aa5e0175f563"} Apr 16 23:19:14.196358 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:14.196286 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8ccpf/must-gather-6gxgf" podStartSLOduration=1.314514033 podStartE2EDuration="2.196271577s" podCreationTimestamp="2026-04-16 23:19:12 +0000 UTC" firstStartedPulling="2026-04-16 23:19:12.817767219 +0000 UTC m=+3945.663536109" lastFinishedPulling="2026-04-16 23:19:13.69952476 +0000 UTC m=+3946.545293653" observedRunningTime="2026-04-16 23:19:14.191574987 +0000 UTC m=+3947.037343904" watchObservedRunningTime="2026-04-16 23:19:14.196271577 +0000 UTC m=+3947.042040510" Apr 16 23:19:15.137228 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:15.137196 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wvd8j_c3e33b82-1d36-4e38-ae60-42189b25da6e/global-pull-secret-syncer/0.log" Apr 16 23:19:15.191869 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:15.191838 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fjlpn_57d6ffd8-4f1e-4169-adf4-276bac26da26/konnectivity-agent/0.log" Apr 16 23:19:15.325173 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:15.325144 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-39.ec2.internal_d26c4d7ef73b8123da4c66a608d9e63b/haproxy/0.log" Apr 16 23:19:19.060462 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.060435 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/alertmanager/0.log" Apr 16 23:19:19.089173 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.089141 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/config-reloader/0.log" Apr 16 23:19:19.117877 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.117853 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/kube-rbac-proxy-web/0.log" Apr 16 23:19:19.145560 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.145531 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/kube-rbac-proxy/0.log" Apr 16 23:19:19.173534 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.173508 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/kube-rbac-proxy-metric/0.log" Apr 16 23:19:19.198505 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.198466 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/prom-label-proxy/0.log" Apr 16 23:19:19.225658 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.225619 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a98cab9-f502-4096-b3bd-6cb5d95b5cbd/init-config-reloader/0.log" Apr 16 23:19:19.279540 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.279512 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5c4w2_a242c550-213a-4a82-8bbb-01a37bbc13c5/cluster-monitoring-operator/0.log" Apr 16 23:19:19.301369 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.301339 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xwxff_47cb66de-e3fc-492e-a19a-947935e77c6d/kube-state-metrics/0.log" Apr 16 23:19:19.326793 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.326699 
2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xwxff_47cb66de-e3fc-492e-a19a-947935e77c6d/kube-rbac-proxy-main/0.log"
Apr 16 23:19:19.352174 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.352146 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xwxff_47cb66de-e3fc-492e-a19a-947935e77c6d/kube-rbac-proxy-self/0.log"
Apr 16 23:19:19.418103 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.418067 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hkct8_54352985-5c56-411a-9cc0-f1fb17b7e0ab/monitoring-plugin/0.log"
Apr 16 23:19:19.451692 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.451664 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-64xk5_33bca243-d299-49da-a48f-9f0396daab4e/node-exporter/0.log"
Apr 16 23:19:19.478658 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.478628 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-64xk5_33bca243-d299-49da-a48f-9f0396daab4e/kube-rbac-proxy/0.log"
Apr 16 23:19:19.502125 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.502098 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-64xk5_33bca243-d299-49da-a48f-9f0396daab4e/init-textfile/0.log"
Apr 16 23:19:19.707614 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.707526 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m6k6x_ad29900d-bb7e-4e80-87db-950ad0387eff/kube-rbac-proxy-main/0.log"
Apr 16 23:19:19.741186 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.741155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m6k6x_ad29900d-bb7e-4e80-87db-950ad0387eff/kube-rbac-proxy-self/0.log"
Apr 16 23:19:19.771952 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:19.770907 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m6k6x_ad29900d-bb7e-4e80-87db-950ad0387eff/openshift-state-metrics/0.log"
Apr 16 23:19:20.071425 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.071392 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hdjfm_45c088ad-918e-444c-8625-7a0f758483d7/prometheus-operator-admission-webhook/0.log"
Apr 16 23:19:20.101261 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.101195 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85c96466d6-vt5bm_f2484156-d744-43aa-8be7-f03249b65d82/telemeter-client/0.log"
Apr 16 23:19:20.123605 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.123582 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85c96466d6-vt5bm_f2484156-d744-43aa-8be7-f03249b65d82/reload/0.log"
Apr 16 23:19:20.146992 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.146961 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85c96466d6-vt5bm_f2484156-d744-43aa-8be7-f03249b65d82/kube-rbac-proxy/0.log"
Apr 16 23:19:20.209742 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.209702 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9d8fbcf6f-pxjb6_c47807c1-3916-46cb-8aeb-41f3b3e23fd9/thanos-query/0.log"
Apr 16 23:19:20.240824 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.240787 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9d8fbcf6f-pxjb6_c47807c1-3916-46cb-8aeb-41f3b3e23fd9/kube-rbac-proxy-web/0.log"
Apr 16 23:19:20.266266 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.266223 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9d8fbcf6f-pxjb6_c47807c1-3916-46cb-8aeb-41f3b3e23fd9/kube-rbac-proxy/0.log"
Apr 16 23:19:20.290515 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.290474 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9d8fbcf6f-pxjb6_c47807c1-3916-46cb-8aeb-41f3b3e23fd9/prom-label-proxy/0.log"
Apr 16 23:19:20.314572 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.314547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9d8fbcf6f-pxjb6_c47807c1-3916-46cb-8aeb-41f3b3e23fd9/kube-rbac-proxy-rules/0.log"
Apr 16 23:19:20.337588 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:20.337516 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9d8fbcf6f-pxjb6_c47807c1-3916-46cb-8aeb-41f3b3e23fd9/kube-rbac-proxy-metrics/0.log"
Apr 16 23:19:21.217786 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.217706 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-6z6fs_35b5e138-1f3a-4054-9071-6e12676b9b25/networking-console-plugin/0.log"
Apr 16 23:19:21.574331 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.574305 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/1.log"
Apr 16 23:19:21.579661 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.579640 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s2fv5_42c17f94-3e64-4703-a384-9593c55d048f/console-operator/2.log"
Apr 16 23:19:21.931179 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.931111 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f459c98f5-q44cn_c199ff16-213e-4848-bf36-552d5c622b1e/console/0.log"
Apr 16 23:19:21.956896 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.956868 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"]
Apr 16 23:19:21.962173 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.962149 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:21.968810 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.968782 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"]
Apr 16 23:19:21.982595 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:21.982572 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lbd7f_e2010ec2-eb19-4021-a065-02a91f2ca7ee/download-server/0.log"
Apr 16 23:19:22.069022 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.068991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-lib-modules\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.069022 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.069027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5rlb\" (UniqueName: \"kubernetes.io/projected/47a191a9-c603-4d77-a725-cf331fa76920-kube-api-access-m5rlb\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.069229 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.069114 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-podres\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.069229 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.069175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-sys\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.069305 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.069266 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-proc\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170593 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170552 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-podres\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170797 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-sys\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170797 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170763 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-podres\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170906 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-proc\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170906 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-sys\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170906 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-lib-modules\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.170906 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5rlb\" (UniqueName: \"kubernetes.io/projected/47a191a9-c603-4d77-a725-cf331fa76920-kube-api-access-m5rlb\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.171087 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-proc\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.171087 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.170929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47a191a9-c603-4d77-a725-cf331fa76920-lib-modules\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.178751 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.178710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5rlb\" (UniqueName: \"kubernetes.io/projected/47a191a9-c603-4d77-a725-cf331fa76920-kube-api-access-m5rlb\") pod \"perf-node-gather-daemonset-tc6pn\" (UID: \"47a191a9-c603-4d77-a725-cf331fa76920\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.274646 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.274613 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:22.385158 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.385132 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-rwd92_1326ec62-5db0-4705-a851-056172e81fd1/volume-data-source-validator/0.log"
Apr 16 23:19:22.415683 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:22.415658 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"]
Apr 16 23:19:23.078529 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.078500 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rpbsg_7b7fa80a-7e5b-4b14-8792-ff01bd1f2143/dns/0.log"
Apr 16 23:19:23.098656 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.098617 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rpbsg_7b7fa80a-7e5b-4b14-8792-ff01bd1f2143/kube-rbac-proxy/0.log"
Apr 16 23:19:23.190191 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.190164 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-msw8q_e7417e8e-90d4-47a8-926c-b10f15f3a850/dns-node-resolver/0.log"
Apr 16 23:19:23.219805 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.219761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn" event={"ID":"47a191a9-c603-4d77-a725-cf331fa76920","Type":"ContainerStarted","Data":"f28a1671bcf6d9e60d3678318aea1cbfdd7f2727fe8e499dca9c1f53906d8d8c"}
Apr 16 23:19:23.219805 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.219808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn" event={"ID":"47a191a9-c603-4d77-a725-cf331fa76920","Type":"ContainerStarted","Data":"150b1de9375692c1021ed45c9842dcc34fed3a7a1b71da7b50883a5486ef2826"}
Apr 16 23:19:23.220021 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.219887 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:23.237091 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.237044 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn" podStartSLOduration=2.237029435 podStartE2EDuration="2.237029435s" podCreationTimestamp="2026-04-16 23:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:19:23.234589237 +0000 UTC m=+3956.080358149" watchObservedRunningTime="2026-04-16 23:19:23.237029435 +0000 UTC m=+3956.082798346"
Apr 16 23:19:23.609672 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:23.609645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flbzq_598903e0-6d7b-4392-b685-da66c0408923/node-ca/0.log"
Apr 16 23:19:24.696105 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:24.696078 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sfffv_ed1be888-8420-4861-992a-ffd27fc02a14/serve-healthcheck-canary/0.log"
Apr 16 23:19:25.015585 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:25.015557 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-87gkq_7db6d6a8-5304-4b41-87c9-a4f433031f6e/insights-operator/0.log"
Apr 16 23:19:25.016355 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:25.016337 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-87gkq_7db6d6a8-5304-4b41-87c9-a4f433031f6e/insights-operator/1.log"
Apr 16 23:19:25.035198 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:25.035181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8wkch_39788394-ead2-4c74-8022-fedd8f2c6a08/kube-rbac-proxy/0.log"
Apr 16 23:19:25.054451 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:25.054432 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8wkch_39788394-ead2-4c74-8022-fedd8f2c6a08/exporter/0.log"
Apr 16 23:19:25.073996 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:25.073977 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8wkch_39788394-ead2-4c74-8022-fedd8f2c6a08/extractor/0.log"
Apr 16 23:19:27.104672 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:27.104640 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2t5f6_85b604ac-4812-41d0-b537-df9b5f259f6a/manager/0.log"
Apr 16 23:19:27.370500 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:27.370414 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-tmw2j_6bfd6dbc-43b3-4753-a985-31d5fd269dd0/manager/0.log"
Apr 16 23:19:27.388642 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:27.388610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-lr6f4_3909df36-8a3d-4e72-a2f3-8a59a16ff652/s3-init/0.log"
Apr 16 23:19:27.410241 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:27.410225 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-bgx52_496c0a23-ddfc-4020-bcc7-0894a11db16c/s3-tls-init-custom/0.log"
Apr 16 23:19:27.431412 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:27.431386 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-zjmhj_dedb1bb9-2004-41ae-b7b2-adfe73e6914a/s3-tls-init-serving/0.log"
Apr 16 23:19:29.235669 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:29.235638 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-tc6pn"
Apr 16 23:19:32.604557 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.604524 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5c5ns_e3a1ce43-bbf9-45df-abbd-7ec6821f991b/kube-multus/0.log"
Apr 16 23:19:32.776779 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.776754 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/kube-multus-additional-cni-plugins/0.log"
Apr 16 23:19:32.799539 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.799519 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/egress-router-binary-copy/0.log"
Apr 16 23:19:32.821564 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.821545 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/cni-plugins/0.log"
Apr 16 23:19:32.841131 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.841110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/bond-cni-plugin/0.log"
Apr 16 23:19:32.861590 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.861508 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/routeoverride-cni/0.log"
Apr 16 23:19:32.882353 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.882323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/whereabouts-cni-bincopy/0.log"
Apr 16 23:19:32.905684 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:32.905661 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knsc9_bcee1191-76cc-4c41-ad57-b41d75589f20/whereabouts-cni/0.log"
Apr 16 23:19:33.287716 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:33.287687 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qgfjd_a2d7d39e-d19f-4a6e-8107-593903f29181/network-metrics-daemon/0.log"
Apr 16 23:19:33.309475 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:33.309450 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qgfjd_a2d7d39e-d19f-4a6e-8107-593903f29181/kube-rbac-proxy/0.log"
Apr 16 23:19:34.512618 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.512589 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-controller/0.log"
Apr 16 23:19:34.531278 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.531252 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/0.log"
Apr 16 23:19:34.556671 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.556643 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovn-acl-logging/1.log"
Apr 16 23:19:34.575777 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.575756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/kube-rbac-proxy-node/0.log"
Apr 16 23:19:34.594423 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.594400 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 23:19:34.611604 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.611582 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/northd/0.log"
Apr 16 23:19:34.630356 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.630339 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/nbdb/0.log"
Apr 16 23:19:34.655943 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.655926 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/sbdb/0.log"
Apr 16 23:19:34.767136 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:34.767061 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z4npz_3894fa51-bc91-4390-ab13-ef051552e33a/ovnkube-controller/0.log"
Apr 16 23:19:35.716988 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:35.716958 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6978g_e531ad1d-2d55-48e3-afc2-f5404821539c/network-check-target-container/0.log"
Apr 16 23:19:36.577537 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:36.577503 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8r8rq_c65407a6-d8b9-47f5-ac3f-231ddd09de73/iptables-alerter/0.log"
Apr 16 23:19:37.167324 ip-10-0-136-39 kubenswrapper[2579]: I0416 23:19:37.167297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8x5qm_dae5bcc7-87a6-403b-ab23-dc1fc36c0615/tuned/0.log"