Apr 17 11:13:29.913633 ip-10-0-133-230 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:29.913642 ip-10-0-133-230 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:29.913649 ip-10-0-133-230 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:29.913871 ip-10-0-133-230 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:13:41.211338 ip-10-0-133-230 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:13:41.211353 ip-10-0-133-230 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3176f904870a4beeb932c04d99ddf1a1 --
Apr 17 11:16:04.295719 ip-10-0-133-230 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:04.756413 ip-10-0-133-230 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:04.756413 ip-10-0-133-230 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:04.756413 ip-10-0-133-230 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:04.756413 ip-10-0-133-230 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:04.756413 ip-10-0-133-230 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:04.758680 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.758579 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:04.763080 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763060 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:04.763080 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763079 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763083 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763087 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763091 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763094 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763097 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763100 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763103 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763106 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763111 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763114 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763123 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763127 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763129 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763132 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763135 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763138 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763140 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763144 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:04.763150 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763147 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763150 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763152 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763155 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763158 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763165 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763168 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763171 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763173 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763176 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763179 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763181 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763184 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763187 2579 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763189 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763192 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763195 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763198 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763200 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763203 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:04.763630 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763205 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763208 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763210 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763213 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763215 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763217 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763220 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763222 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763225 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763227 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763230 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763232 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763235 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763238 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763242 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763245 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:04.764099 ip-10-0-133-230 
kubenswrapper[2579]: W0417 11:16:04.763247 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763250 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763253 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763255 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:04.764099 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763258 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763260 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763263 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763267 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763271 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763273 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763276 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763279 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763282 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763284 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763288 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763290 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763293 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763295 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763298 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763300 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763303 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763305 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:04.764600 
ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763308 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:04.764600 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763311 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763314 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763317 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763319 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763321 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763324 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.763326 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764853 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764862 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764866 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764868 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764871 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764874 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764878 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764880 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764883 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764885 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764888 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764891 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764894 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:04.765047 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764897 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764899 2579 feature_gate.go:328] unrecognized feature 
gate: HighlyAvailableArbiter Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764904 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764909 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764913 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764917 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764920 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764923 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764926 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764929 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764932 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764934 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764937 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764939 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764942 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764945 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764947 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764950 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764952 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:04.765561 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764956 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764959 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764962 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764965 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764968 2579 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764970 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764973 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764976 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764978 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764981 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764983 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764986 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764989 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764991 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764994 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764996 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.764999 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765002 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765005 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765007 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:04.766072 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765009 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765012 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765015 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765017 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765020 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765022 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765024 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:04.766587 
ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765027 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765029 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765032 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765035 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765037 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765041 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765043 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765046 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765049 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765052 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765054 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765057 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765060 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:04.766587 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765063 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765065 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765068 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765070 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765073 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765076 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765079 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765081 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765084 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765086 
2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765089 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765092 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765094 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765097 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765173 2579 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765182 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765193 2579 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765198 2579 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765202 2579 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765206 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765213 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 11:16:04.767078 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765218 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765221 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765224 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765229 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765233 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765237 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765240 2579 flags.go:64] FLAG: --cgroup-root="" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765243 2579 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765246 2579 flags.go:64] FLAG: --client-ca-file="" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765249 2579 flags.go:64] FLAG: --cloud-config="" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765252 2579 flags.go:64] FLAG: --cloud-provider="external" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765256 2579 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765260 2579 flags.go:64] FLAG: --cluster-domain="" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:16:04.765263 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765266 2579 flags.go:64] FLAG: --config-dir="" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765269 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765272 2579 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765277 2579 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765280 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765284 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765287 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765291 2579 flags.go:64] FLAG: --contention-profiling="false" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765294 2579 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765298 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765301 2579 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 11:16:04.767603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765304 2579 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765308 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765311 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765314 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765317 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765321 2579 flags.go:64] FLAG: --enable-server="true" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765324 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765328 2579 flags.go:64] FLAG: --event-burst="100" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765332 2579 flags.go:64] FLAG: --event-qps="50" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765336 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765339 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765343 2579 flags.go:64] FLAG: --eviction-hard="" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765347 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765362 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 
11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765366 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765369 2579 flags.go:64] FLAG: --eviction-soft="" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765372 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765375 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765378 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765381 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765384 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765387 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765390 2579 flags.go:64] FLAG: --feature-gates="" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765393 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765396 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 11:16:04.768204 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765399 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765403 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765406 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765409 2579 flags.go:64] FLAG: --help="false" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765412 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-133-230.ec2.internal" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765416 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765419 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765422 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765426 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765429 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765432 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765435 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765438 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765441 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 
17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765444 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765447 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765450 2579 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765454 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765457 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765460 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765463 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765466 2579 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765469 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765472 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:04.768825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765476 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765481 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765484 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765487 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765490 2579 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765493 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765497 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765500 2579 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765503 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765507 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765511 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765515 2579 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765518 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765521 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765525 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765528 2579 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765531 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765534 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765537 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765545 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765548 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765551 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765554 2579 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:04.769430 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765557 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765563 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765566 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765570 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765574 2579 flags.go:64] FLAG: --port="10250" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765577 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765580 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-042651b4dc021f634" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765583 2579 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765586 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765589 2579 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765592 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765595 2579 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765599 2579 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765602 2579 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765605 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765608 2579 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765612 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765615 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:04.769984 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:16:04.765618 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765622 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765625 2579 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765628 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765631 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765634 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765637 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765640 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:04.769984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765644 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765647 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765650 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765653 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765656 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765659 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765662 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765665 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765668 2579 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765671 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765677 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765680 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765683 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765687 2579 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765691 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765694 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765697 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765700 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 
17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765703 2579 flags.go:64] FLAG: --v="2" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765708 2579 flags.go:64] FLAG: --version="false" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765713 2579 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765717 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.765721 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765821 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:04.770665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765825 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765828 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765831 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765833 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765836 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765840 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765844 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765847 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765850 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765853 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765856 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765858 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765861 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765864 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765866 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765872 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765875 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765878 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765880 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:04.771335 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765883 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765885 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765888 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765891 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765893 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765896 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765899 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765902 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765904 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:04.771879 
ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765907 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765910 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765913 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765915 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765918 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765920 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765923 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765925 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765928 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765930 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765933 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:04.771879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765936 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765941 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765943 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765946 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765949 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765951 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765954 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765957 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765961 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765963 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765969 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765972 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765974 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765977 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765980 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765983 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765986 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765988 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765991 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765994 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:04.772436 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765996 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.765999 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766001 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: 
W0417 11:16:04.766004 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766007 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766009 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766012 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766014 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766017 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766020 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766022 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766025 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766027 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766030 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766033 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766036 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766038 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766041 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766043 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766046 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:04.772942 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766054 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766056 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766061 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766064 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766067 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.766069 2579 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.766780 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.773405 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 11:16:04.773447 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.773426 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773477 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773483 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773487 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773490 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773492 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773495 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773498 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773502 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773505 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773508 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773511 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773513 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773516 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773519 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773521 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773524 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773527 2579 
feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773530 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773533 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:04.773665 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773535 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773538 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773541 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773543 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773546 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773549 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773551 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773554 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773556 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773559 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773561 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773566 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
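
The "unrecognized feature gate" and "Setting deprecated/GA feature gate" messages above all point at feature_gate.go, i.e. the shared component-base feature-gate machinery: only gate names that a binary has registered are recognized when the override map is applied, and overriding a Deprecated or GA gate produces the extra warning. A minimal Go sketch of that mechanism follows; it is illustrative only, not the kubelet's (or kubenswrapper's) actual registration code, the gate names are taken from the log, and the defaults/stages are assumptions.

    // Sketch: register two gates, apply an override map, query the result.
    package main

    import (
        "fmt"

        "k8s.io/component-base/featuregate"
    )

    const (
        KMSv1       featuregate.Feature = "KMSv1"       // logged as a deprecated gate
        ImageVolume featuregate.Feature = "ImageVolume" // default/stage assumed here
    )

    func main() {
        gates := featuregate.NewFeatureGate()

        // A component must register every gate it understands; names that are
        // never registered are what surface as "unrecognized feature gate"
        // warnings when the override map is applied.
        if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            KMSv1:       {Default: false, PreRelease: featuregate.Deprecated},
            ImageVolume: {Default: false, PreRelease: featuregate.Beta},
        }); err != nil {
            panic(err)
        }

        // Apply overrides, as in the logged "feature gates: {map[... KMSv1:true ...]}".
        // Overriding a Deprecated gate is what triggers the
        // "Setting deprecated feature gate KMSv1=true" warning seen above.
        if err := gates.SetFromMap(map[string]bool{
            "KMSv1":       true,
            "ImageVolume": true,
        }); err != nil {
            panic(err)
        }

        fmt.Println("KMSv1 enabled:", gates.Enabled(KMSv1))
        fmt.Println("ImageVolume enabled:", gates.Enabled(ImageVolume))
    }
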
Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773571 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773574 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773580 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773583 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773586 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773589 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773592 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773595 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:04.774149 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773598 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773601 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773603 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773606 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773608 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773611 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773614 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773617 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773619 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773623 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773626 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773630 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773633 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773635 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773638 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773641 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773644 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773646 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773649 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:04.774750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773652 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773654 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773657 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773659 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773662 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773665 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773668 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773672 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773675 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773677 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773680 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773682 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773685 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773688 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773690 2579 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773693 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773695 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773698 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773700 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773703 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:04.775214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773705 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773708 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773710 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773713 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773715 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773718 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773720 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773723 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.773728 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773835 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773840 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773843 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773846 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773849 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 
11:16:04.773852 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773855 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:04.775743 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773858 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773861 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773865 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773868 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773871 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773874 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773877 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773879 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773882 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773885 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773887 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773890 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773893 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773896 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773899 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773901 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773904 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773906 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773910 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:04.776154 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773914 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773917 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773919 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773922 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773924 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773927 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773930 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773933 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773935 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773938 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773940 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773943 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773945 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773948 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773951 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773954 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773957 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773960 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773963 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773965 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:04.776640 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773968 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773971 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 
11:16:04.773973 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773976 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773978 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773981 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773984 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773986 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773989 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773991 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773994 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773996 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.773999 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774001 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774004 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774006 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774009 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774012 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774014 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:04.777129 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774017 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774019 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774023 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774025 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774027 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774030 2579 feature_gate.go:328] 
unrecognized feature gate: UpgradeStatus Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774033 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774035 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774038 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774041 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774044 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774048 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774051 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774053 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774056 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774059 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774061 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774064 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774066 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:04.777635 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774070 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:04.778139 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:04.774074 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:04.778139 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.774079 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 11:16:04.778139 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.774761 2579 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 11:16:04.778139 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.777591 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 11:16:04.778611 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.778595 2579 server.go:1019] "Starting client certificate rotation" Apr 17 11:16:04.778722 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.778701 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 11:16:04.779082 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.779071 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 11:16:04.807172 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.807148 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 11:16:04.809598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.809570 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 11:16:04.827552 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.827521 2579 log.go:25] "Validated CRI v1 runtime API" Apr 17 11:16:04.834266 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.834239 2579 log.go:25] "Validated CRI v1 image API" Apr 17 11:16:04.836380 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.836343 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 11:16:04.838730 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.838710 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:04.841146 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.841126 2579 fs.go:135] Filesystem UUIDs: map[5a474988-d61c-45cd-9489-dd5c4e39fcee:/dev/nvme0n1p4 7791856b-7de3-4ddd-87ae-f72e074d8c1c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 11:16:04.841203 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.841147 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 
fsType:overlay blockSize:0}] Apr 17 11:16:04.847153 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.847037 2579 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:04.844870247 +0000 UTC m=+0.424000985 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100656 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2768b1e3438e5092a733efa607189d SystemUUID:ec2768b1-e343-8e50-92a7-33efa607189d BootID:3176f904-870a-4bee-b932-c04d99ddf1a1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:43:c8:d2:19:1b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:43:c8:d2:19:1b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:8e:17:0b:95:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 11:16:04.847850 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.847839 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 11:16:04.847949 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.847936 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 11:16:04.849093 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.849065 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 11:16:04.849238 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.849096 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-230.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 11:16:04.849286 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.849248 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 11:16:04.849286 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.849257 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 11:16:04.849286 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.849270 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:04.849286 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.849284 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:04.850261 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.850250 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:04.850387 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.850378 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 11:16:04.852637 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.852625 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 17 11:16:04.852687 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.852646 2579 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 17 11:16:04.852687 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.852663 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 11:16:04.852687 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.852673 2579 kubelet.go:397] "Adding apiserver pod source" Apr 17 11:16:04.852687 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.852684 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 11:16:04.853755 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.853741 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:04.853798 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.853761 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:04.857300 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.857280 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:16:04.858795 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.858780 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:04.861043 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861030 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:04.861083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861048 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:04.861083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861054 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:04.861083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861060 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:04.861083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861066 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:04.861083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861072 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:04.861083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861078 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:04.861239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861087 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:04.861239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861094 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:04.861239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861100 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:04.861239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861119 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:16:04.861239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861128 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:04.861807 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861797 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:04.861807 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.861807 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 17 11:16:04.869199 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.869170 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:04.869199 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.869189 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:04.869199 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.869178 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-230.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:04.869734 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.869722 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:04.869780 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.869772 2579 server.go:1295] "Started kubelet" Apr 17 11:16:04.869955 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.869871 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 11:16:04.870094 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.869930 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:04.870094 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.870006 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:04.870813 ip-10-0-133-230 systemd[1]: Started Kubernetes Kubelet. 
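
The "Creating Container Manager object based on Node Config" entry above echoes the node's effective settings: the systemd cgroup driver, systemReserved of cpu=500m, memory=1Gi, ephemeral-storage=1Gi, and the hard eviction thresholds. As a rough illustration, the same values would be expressed in a kubelet configuration file through the public kubelet.config.k8s.io/v1beta1 types; the sketch below builds such an object in Go and is not this node's actual configuration.

    // Sketch: express the logged nodeConfig values as a KubeletConfiguration.
    package main

    import (
        "encoding/json"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
    )

    func main() {
        cfg := kubeletv1beta1.KubeletConfiguration{
            TypeMeta: metav1.TypeMeta{
                APIVersion: "kubelet.config.k8s.io/v1beta1",
                Kind:       "KubeletConfiguration",
            },
            // Matches "CgroupDriver":"systemd" in the logged nodeConfig.
            CgroupDriver: "systemd",
            // Matches the logged SystemReserved map.
            SystemReserved: map[string]string{
                "cpu":               "500m",
                "memory":            "1Gi",
                "ephemeral-storage": "1Gi",
            },
            // One of the logged HardEvictionThresholds, written as a config field.
            EvictionHard: map[string]string{
                "memory.available": "100Mi",
            },
        }

        out, err := json.MarshalIndent(cfg, "", "  ")
        if err != nil {
            panic(err)
        }
        fmt.Println(string(out))
    }
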
Apr 17 11:16:04.871441 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.871208 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:04.872720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.872707 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:04.876606 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.875669 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-230.ec2.internal.18a720bad756a1f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-230.ec2.internal,UID:ip-10-0-133-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-230.ec2.internal,},FirstTimestamp:2026-04-17 11:16:04.869734905 +0000 UTC m=+0.448865643,LastTimestamp:2026-04-17 11:16:04.869734905 +0000 UTC m=+0.448865643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-230.ec2.internal,}" Apr 17 11:16:04.877003 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.876988 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:04.877067 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877003 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:04.877721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877693 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:04.877721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877694 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:04.877881 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877729 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:04.877881 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877865 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:04.877881 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877873 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:04.878014 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877904 2579 factory.go:55] Registering systemd factory Apr 17 11:16:04.878014 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.877975 2579 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:04.878241 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.878225 2579 factory.go:153] Registering CRI-O factory Apr 17 11:16:04.878241 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.878241 2579 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:04.878344 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.878305 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:04.878344 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.878334 2579 factory.go:103] Registering Raw factory Apr 17 11:16:04.878468 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.878374 2579 
manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:04.878798 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.878770 2579 manager.go:319] Starting recovery of all containers Apr 17 11:16:04.879383 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.879333 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:04.879721 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.879697 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:04.887629 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.887388 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 11:16:04.887781 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.887433 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mhqxd" Apr 17 11:16:04.889835 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.889811 2579 manager.go:324] Recovery completed Apr 17 11:16:04.890788 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.890754 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 11:16:04.891431 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.891386 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 11:16:04.893730 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.893554 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 11:16:04.895592 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.895568 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mhqxd" Apr 17 11:16:04.897812 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.897797 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:04.900717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.900698 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:04.900825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.900730 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:04.900825 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.900740 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:04.901203 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.901190 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:04.901203 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.901200 
2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:04.901295 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.901218 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:04.903255 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.903175 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-230.ec2.internal.18a720bad92f5a15 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-230.ec2.internal,UID:ip-10-0-133-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-230.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-230.ec2.internal,},FirstTimestamp:2026-04-17 11:16:04.900715029 +0000 UTC m=+0.479845767,LastTimestamp:2026-04-17 11:16:04.900715029 +0000 UTC m=+0.479845767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-230.ec2.internal,}" Apr 17 11:16:04.904501 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.904481 2579 policy_none.go:49] "None policy: Start" Apr 17 11:16:04.904501 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.904502 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:04.904588 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.904513 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:04.961115 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961098 2579 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:04.961218 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.961136 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:04.961218 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961147 2579 server.go:85] "Starting device plugin registration server" Apr 17 11:16:04.961517 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961504 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:04.961569 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961520 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:04.961644 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961622 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:04.961753 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961735 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:04.961753 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.961753 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:04.962258 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.962235 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 11:16:04.962362 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.962275 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:04.979297 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.979261 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:04.979297 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.979301 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:04.979536 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.979326 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:04.979536 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.979335 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:04.979536 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:04.979389 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:04.984132 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:04.984111 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:05.062478 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.062407 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:05.063681 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.063662 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:05.063760 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.063696 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:05.063760 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.063709 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:05.063760 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.063739 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.073018 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.072990 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.073169 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.073025 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-230.ec2.internal\": node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.079839 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.079806 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal"] Apr 17 11:16:05.079927 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.079889 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:05.081641 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.081621 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:05.081746 ip-10-0-133-230 kubenswrapper[2579]: 
I0417 11:16:05.081654 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:05.081746 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.081668 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:05.083787 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.083764 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:05.083908 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.083888 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.083976 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.083932 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:05.084698 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.084681 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:05.084771 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.084715 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:05.084771 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.084726 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:05.084771 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.084681 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:05.084862 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.084799 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:05.084862 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.084817 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:05.086703 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.086689 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.086798 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.086715 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:05.087633 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.087613 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:05.087721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.087643 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:05.087721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.087655 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:05.090046 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.090028 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.113325 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.113300 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-230.ec2.internal\" not found" node="ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.116882 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.116861 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-230.ec2.internal\" not found" node="ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.179625 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.179582 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53ca2259f548b65113e09992cb274ca5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal\" (UID: \"53ca2259f548b65113e09992cb274ca5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.179625 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.179622 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ca2259f548b65113e09992cb274ca5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal\" (UID: \"53ca2259f548b65113e09992cb274ca5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.179835 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.179647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef42997bfb50561dbaa710ba959617d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-230.ec2.internal\" (UID: \"0ef42997bfb50561dbaa710ba959617d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.190126 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.190100 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.280770 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.280737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/53ca2259f548b65113e09992cb274ca5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal\" (UID: \"53ca2259f548b65113e09992cb274ca5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.280889 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.280779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ca2259f548b65113e09992cb274ca5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal\" (UID: \"53ca2259f548b65113e09992cb274ca5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.280889 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.280808 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef42997bfb50561dbaa710ba959617d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-230.ec2.internal\" (UID: \"0ef42997bfb50561dbaa710ba959617d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.280889 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.280857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef42997bfb50561dbaa710ba959617d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-230.ec2.internal\" (UID: \"0ef42997bfb50561dbaa710ba959617d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.280889 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.280866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53ca2259f548b65113e09992cb274ca5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal\" (UID: \"53ca2259f548b65113e09992cb274ca5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.281020 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.280860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ca2259f548b65113e09992cb274ca5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal\" (UID: \"53ca2259f548b65113e09992cb274ca5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.290843 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.290813 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.391800 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.391736 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.415952 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.415919 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.419657 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.419633 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" Apr 17 11:16:05.492754 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.492711 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.593266 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.593230 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.693954 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.693873 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.779214 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.779172 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 11:16:05.779883 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.779375 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:05.794575 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.794495 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.856231 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.856197 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:05.877178 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.877154 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:05.887845 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.887809 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:05.895336 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:05.895311 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-230.ec2.internal\" not found" Apr 17 11:16:05.898506 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.898462 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:04 +0000 UTC" deadline="2027-12-27 06:05:53.526659822 +0000 UTC" Apr 17 11:16:05.898506 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.898504 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14850h49m47.628160583s" Apr 17 11:16:05.931855 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.931817 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tnt9t" Apr 17 11:16:05.939647 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.939623 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tnt9t" Apr 17 11:16:05.990346 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:05.990312 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ca2259f548b65113e09992cb274ca5.slice/crio-fcbe63c666b1ff5f2e94331b910dc904061c8eb709fa1b992b7e19114f83f5ca WatchSource:0}: Error finding container fcbe63c666b1ff5f2e94331b910dc904061c8eb709fa1b992b7e19114f83f5ca: Status 404 returned error can't find the container with id fcbe63c666b1ff5f2e94331b910dc904061c8eb709fa1b992b7e19114f83f5ca Apr 17 11:16:05.990516 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.990407 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:05.995953 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:05.995934 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:16:06.000762 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:06.000729 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef42997bfb50561dbaa710ba959617d.slice/crio-3faa3baf48e03605563f93d93d762ad6c3c3adb012959d3266a1082a8d49c6fc WatchSource:0}: Error finding container 3faa3baf48e03605563f93d93d762ad6c3c3adb012959d3266a1082a8d49c6fc: Status 404 returned error can't find the container with id 3faa3baf48e03605563f93d93d762ad6c3c3adb012959d3266a1082a8d49c6fc Apr 17 11:16:06.077774 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.077731 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" Apr 17 11:16:06.091916 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.091891 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:06.093035 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.093018 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" Apr 17 11:16:06.100964 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.100938 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:06.323135 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.323045 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:06.853847 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.853816 2579 apiserver.go:52] "Watching apiserver" Apr 17 11:16:06.861203 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.861171 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:16:06.861613 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.861585 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/node-ca-szsmw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal","openshift-multus/multus-additional-cni-plugins-6gg86","openshift-network-diagnostics/network-check-target-cxlg6","openshift-ovn-kubernetes/ovnkube-node-p8dz7","kube-system/konnectivity-agent-mhcgl","kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw","openshift-cluster-node-tuning-operator/tuned-c9tz8","openshift-dns/node-resolver-w2vns","openshift-multus/multus-dwf52","openshift-multus/network-metrics-daemon-n2dw9","openshift-network-operator/iptables-alerter-n8ccr"] Apr 17 11:16:06.864235 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.864207 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.866648 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.866466 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s9wql\"" Apr 17 11:16:06.866648 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.866470 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.866648 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.866562 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.868939 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.868910 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.869192 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.869168 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:06.869328 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:06.869290 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:06.870726 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.870684 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:16:06.870841 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.870746 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-77pwz\"" Apr 17 11:16:06.870841 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.870800 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.870841 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.870817 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.871000 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.870683 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 11:16:06.871000 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.870922 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:16:06.871703 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.871525 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.873185 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873161 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:06.873322 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873299 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:06.873453 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873436 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.873453 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873443 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qqsxn\"" Apr 17 11:16:06.873597 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873581 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:06.873852 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873810 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:16:06.873955 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.873826 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.874224 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.874201 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:06.875794 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.875772 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 11:16:06.875924 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.875906 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9sgbr\"" Apr 17 11:16:06.876091 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.876071 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:16:06.876742 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.876719 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.878474 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.878456 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 11:16:06.878598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.878469 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.878598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.878566 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q7f9b\"" Apr 17 11:16:06.878707 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.878600 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.879252 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.879222 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.881327 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.880950 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.881327 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.881049 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mbtv5\"" Apr 17 11:16:06.881327 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.881318 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.881707 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.881686 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:06.883599 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.883578 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.883716 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.883672 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.883764 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.883713 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 11:16:06.884073 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.884056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.884721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.884704 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9sgx7\"" Apr 17 11:16:06.885719 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.885698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 11:16:06.885818 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.885765 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dptdg\"" Apr 17 11:16:06.886838 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.886818 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:06.886956 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:06.886932 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:06.889247 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.889225 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:06.890763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysctl-d\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.890877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890784 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysctl-conf\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.890877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-var-lib-kubelet\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.890877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.891024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-systemd\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.891024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890952 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqcp\" (UniqueName: \"kubernetes.io/projected/282d56ec-09a3-4c2e-a098-e8271c1f2147-kube-api-access-pgqcp\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.891024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.890983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.891024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-node-log\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.891230 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovnkube-script-lib\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.891230 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891088 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.891230 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-tuned\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891386 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqpw\" (UniqueName: \"kubernetes.io/projected/df30c3d2-0c10-4a19-94e7-a09f60737213-kube-api-access-rgqpw\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891450 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-slash\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.891524 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-run-netns\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.891576 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysconfig\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891625 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891593 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-kubernetes\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891672 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-cnibin\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.891672 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-lib-modules\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891756 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891711 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.891756 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-registration-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.891848 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-device-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.891848 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-modprobe-d\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.891937 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-systemd-units\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.891937 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891914 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-run\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.892028 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-socket-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.892028 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.891972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:06.892028 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-var-lib-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892156 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-env-overrides\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892156 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-sys\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.892247 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-system-cni-dir\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.892247 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-tmp-dir\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.892333 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-systemd\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892332 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-etc-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892456 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892503 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-cni-netd\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892553 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892512 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-cni-bin\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892599 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cjct\" (UniqueName: \"kubernetes.io/projected/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-kube-api-access-6cjct\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892653 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:06.892653 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892630 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:16:06.892765 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-host\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.892820 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892790 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gg86\" (UID: 
\"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.892871 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892833 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-hosts-file\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.892922 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-kubelet\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.892922 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovn-node-metrics-cert\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.893024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892920 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:06.893024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892947 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df30c3d2-0c10-4a19-94e7-a09f60737213-tmp\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.893024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.892979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24jmp\" (UniqueName: \"kubernetes.io/projected/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-kube-api-access-24jmp\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.893024 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-ovn\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.893220 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.893220 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrqf\" (UniqueName: 
\"kubernetes.io/projected/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-kube-api-access-dfrqf\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.893220 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893108 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovnkube-config\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.893220 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893141 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282-konnectivity-ca\") pod \"konnectivity-agent-mhcgl\" (UID: \"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282\") " pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:06.893220 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-os-release\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.893220 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-sys-fs\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.893518 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.893518 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-log-socket\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.893518 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.893426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282-agent-certs\") pod \"konnectivity-agent-mhcgl\" (UID: \"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282\") " pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:06.894681 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.894656 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2flrb\"" Apr 17 11:16:06.940571 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.940514 2579 certificate_manager.go:715] 
"Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:05 +0000 UTC" deadline="2027-10-24 21:00:17.821563914 +0000 UTC" Apr 17 11:16:06.940571 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.940566 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13329h44m10.881002441s" Apr 17 11:16:06.979347 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.979317 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:06.985639 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.985586 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" event={"ID":"0ef42997bfb50561dbaa710ba959617d","Type":"ContainerStarted","Data":"3faa3baf48e03605563f93d93d762ad6c3c3adb012959d3266a1082a8d49c6fc"} Apr 17 11:16:06.986600 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.986574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" event={"ID":"53ca2259f548b65113e09992cb274ca5","Type":"ContainerStarted","Data":"fcbe63c666b1ff5f2e94331b910dc904061c8eb709fa1b992b7e19114f83f5ca"} Apr 17 11:16:06.993616 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-ovn\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.993616 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.993829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfrqf\" (UniqueName: \"kubernetes.io/projected/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-kube-api-access-dfrqf\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.993829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-ovn\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.993829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993721 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.993829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993753 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-cni-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.993829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovnkube-config\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.993829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282-konnectivity-ca\") pod \"konnectivity-agent-mhcgl\" (UID: \"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282\") " pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-os-release\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-sys-fs\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993890 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-os-release\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-cni-bin\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993938 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02807a83-8fd7-42bf-a119-50f4017a2833-iptables-alerter-script\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-sys-fs\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.993985 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-log-socket\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-log-socket\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282-agent-certs\") pod \"konnectivity-agent-mhcgl\" (UID: \"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282\") " pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysctl-d\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994102 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysctl-conf\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-var-lib-kubelet\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-k8s-cni-cncf-io\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysctl-d\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-systemd\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-var-lib-kubelet\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-systemd\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysctl-conf\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqcp\" (UniqueName: \"kubernetes.io/projected/282d56ec-09a3-4c2e-a098-e8271c1f2147-kube-api-access-pgqcp\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994345 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994394 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-node-log\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovnkube-script-lib\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994479 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/44c25c59-0494-490c-9cea-82d3c5d19215-multus-daemon-config\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.994754 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994520 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-tuned\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovnkube-config\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994606 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqpw\" (UniqueName: \"kubernetes.io/projected/df30c3d2-0c10-4a19-94e7-a09f60737213-kube-api-access-rgqpw\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994655 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02807a83-8fd7-42bf-a119-50f4017a2833-host-slash\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj58m\" (UniqueName: \"kubernetes.io/projected/02807a83-8fd7-42bf-a119-50f4017a2833-kube-api-access-wj58m\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-slash\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-slash\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282-konnectivity-ca\") pod \"konnectivity-agent-mhcgl\" (UID: \"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282\") " pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994685 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-node-log\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44c25c59-0494-490c-9cea-82d3c5d19215-cni-binary-copy\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994868 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-netns\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-run-netns\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-run-netns\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.995525 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysconfig\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-kubernetes\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.994992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-os-release\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-kubernetes\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovnkube-script-lib\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-cnibin\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-sysconfig\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995275 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-socket-dir-parent\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-cnibin\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-conf-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-multus-certs\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:16:06.995408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-lib-modules\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-registration-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-device-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e92a775-6760-473c-a919-b7e7bcf242c5-serviceca\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:06.996272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fs95\" (UniqueName: \"kubernetes.io/projected/6e92a775-6760-473c-a919-b7e7bcf242c5-kube-api-access-9fs95\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-etc-kubernetes\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-registration-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995599 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-modprobe-d\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-systemd-units\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-run\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e92a775-6760-473c-a919-b7e7bcf242c5-host\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-system-cni-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-systemd-units\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995711 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-socket-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995784 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-modprobe-d\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995842 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-run\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-device-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.995977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-socket-dir\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/282d56ec-09a3-4c2e-a098-e8271c1f2147-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996059 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-lib-modules\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-cnibin\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.996984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996297 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-var-lib-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-env-overrides\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-sys\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.998775 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:16:06.996389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-system-cni-dir\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-tmp-dir\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-systemd\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-etc-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-cni-netd\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-kubelet\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996558 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-cni-bin\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cjct\" (UniqueName: \"kubernetes.io/projected/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-kube-api-access-6cjct\") pod \"ovnkube-node-p8dz7\" (UID: 
\"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-host\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-var-lib-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-etc-openvswitch\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.998775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-hosts-file\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-hostroot\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-system-cni-dir\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlq6\" (UniqueName: \"kubernetes.io/projected/44c25c59-0494-490c-9cea-82d3c5d19215-kube-api-access-5rlq6\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tskdg\" (UniqueName: 
\"kubernetes.io/projected/00d87433-8bc9-4d18-bab8-e6f889a4b52d-kube-api-access-tskdg\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/282d56ec-09a3-4c2e-a098-e8271c1f2147-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-kubelet\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovn-node-metrics-cert\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-cni-bin\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996896 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-run-systemd\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996812 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996940 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-host\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-cni-netd\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996960 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-hosts-file\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.996835 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df30c3d2-0c10-4a19-94e7-a09f60737213-tmp\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.997006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24jmp\" (UniqueName: \"kubernetes.io/projected/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-kube-api-access-24jmp\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.997064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-cni-multus\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:06.999617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.997146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-host-kubelet\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.997206 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df30c3d2-0c10-4a19-94e7-a09f60737213-sys\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.997261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-env-overrides\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.997587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-tmp-dir\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.998191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/df30c3d2-0c10-4a19-94e7-a09f60737213-etc-tuned\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.998656 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282-agent-certs\") pod \"konnectivity-agent-mhcgl\" (UID: \"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282\") " pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.999225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-ovn-node-metrics-cert\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:07.000403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:06.999664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df30c3d2-0c10-4a19-94e7-a09f60737213-tmp\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:07.002649 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.002620 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:07.002788 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.002661 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:07.002788 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.002676 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:07.002788 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.002768 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:07.502731874 +0000 UTC m=+3.081862604 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:07.005539 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.005420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24jmp\" (UniqueName: \"kubernetes.io/projected/498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf-kube-api-access-24jmp\") pod \"node-resolver-w2vns\" (UID: \"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf\") " pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:07.006060 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.005997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrqf\" (UniqueName: \"kubernetes.io/projected/b7453d4e-eec7-403a-9cc6-a6a9daa972b0-kube-api-access-dfrqf\") pod \"aws-ebs-csi-driver-node-47nvw\" (UID: \"b7453d4e-eec7-403a-9cc6-a6a9daa972b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:07.006254 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.006232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqcp\" (UniqueName: \"kubernetes.io/projected/282d56ec-09a3-4c2e-a098-e8271c1f2147-kube-api-access-pgqcp\") pod \"multus-additional-cni-plugins-6gg86\" (UID: \"282d56ec-09a3-4c2e-a098-e8271c1f2147\") " pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:07.009866 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.007139 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cjct\" (UniqueName: \"kubernetes.io/projected/fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7-kube-api-access-6cjct\") pod \"ovnkube-node-p8dz7\" (UID: \"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:07.009866 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.007156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqpw\" (UniqueName: \"kubernetes.io/projected/df30c3d2-0c10-4a19-94e7-a09f60737213-kube-api-access-rgqpw\") pod \"tuned-c9tz8\" (UID: \"df30c3d2-0c10-4a19-94e7-a09f60737213\") " pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:07.097704 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-kubelet\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.097704 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097711 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-hostroot\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.097926 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097759 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-hostroot\") pod \"multus-dwf52\" (UID: 
\"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.097926 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097773 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-kubelet\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.097926 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097824 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rlq6\" (UniqueName: \"kubernetes.io/projected/44c25c59-0494-490c-9cea-82d3c5d19215-kube-api-access-5rlq6\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.097926 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tskdg\" (UniqueName: \"kubernetes.io/projected/00d87433-8bc9-4d18-bab8-e6f889a4b52d-kube-api-access-tskdg\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:07.097926 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-cni-multus\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-cni-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-os-release\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-cni-bin\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02807a83-8fd7-42bf-a119-50f4017a2833-iptables-alerter-script\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.097986 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-cni-multus\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-k8s-cni-cncf-io\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098047 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-os-release\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098069 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-cni-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098084 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-var-lib-cni-bin\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/44c25c59-0494-490c-9cea-82d3c5d19215-multus-daemon-config\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-k8s-cni-cncf-io\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02807a83-8fd7-42bf-a119-50f4017a2833-host-slash\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098218 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj58m\" (UniqueName: 
\"kubernetes.io/projected/02807a83-8fd7-42bf-a119-50f4017a2833-kube-api-access-wj58m\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02807a83-8fd7-42bf-a119-50f4017a2833-host-slash\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098248 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44c25c59-0494-490c-9cea-82d3c5d19215-cni-binary-copy\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.098270 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-netns\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-socket-dir-parent\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.098347 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:07.598326178 +0000 UTC m=+3.177456922 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098374 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-socket-dir-parent\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-netns\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-conf-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-multus-certs\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e92a775-6760-473c-a919-b7e7bcf242c5-serviceca\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098573 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-multus-conf-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098576 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fs95\" (UniqueName: \"kubernetes.io/projected/6e92a775-6760-473c-a919-b7e7bcf242c5-kube-api-access-9fs95\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-etc-kubernetes\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.098720 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098637 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02807a83-8fd7-42bf-a119-50f4017a2833-iptables-alerter-script\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e92a775-6760-473c-a919-b7e7bcf242c5-host\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098669 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-system-cni-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-etc-kubernetes\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098691 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44c25c59-0494-490c-9cea-82d3c5d19215-cni-binary-copy\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-host-run-multus-certs\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098693 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-cnibin\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098720 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e92a775-6760-473c-a919-b7e7bcf242c5-host\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/44c25c59-0494-490c-9cea-82d3c5d19215-multus-daemon-config\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-cnibin\") pod 
\"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44c25c59-0494-490c-9cea-82d3c5d19215-system-cni-dir\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.099248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.098908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e92a775-6760-473c-a919-b7e7bcf242c5-serviceca\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.106518 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.106434 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlq6\" (UniqueName: \"kubernetes.io/projected/44c25c59-0494-490c-9cea-82d3c5d19215-kube-api-access-5rlq6\") pod \"multus-dwf52\" (UID: \"44c25c59-0494-490c-9cea-82d3c5d19215\") " pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.106518 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.106495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fs95\" (UniqueName: \"kubernetes.io/projected/6e92a775-6760-473c-a919-b7e7bcf242c5-kube-api-access-9fs95\") pod \"node-ca-szsmw\" (UID: \"6e92a775-6760-473c-a919-b7e7bcf242c5\") " pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.107179 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.107157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj58m\" (UniqueName: \"kubernetes.io/projected/02807a83-8fd7-42bf-a119-50f4017a2833-kube-api-access-wj58m\") pod \"iptables-alerter-n8ccr\" (UID: \"02807a83-8fd7-42bf-a119-50f4017a2833\") " pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.107277 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.107249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tskdg\" (UniqueName: \"kubernetes.io/projected/00d87433-8bc9-4d18-bab8-e6f889a4b52d-kube-api-access-tskdg\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:07.177304 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.177258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w2vns" Apr 17 11:16:07.185272 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.185240 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gg86" Apr 17 11:16:07.196162 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.196117 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:07.201054 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.201020 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:07.208810 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.208778 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" Apr 17 11:16:07.216617 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.216587 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" Apr 17 11:16:07.223329 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.223293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-szsmw" Apr 17 11:16:07.229142 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.229115 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dwf52" Apr 17 11:16:07.235800 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.235768 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n8ccr" Apr 17 11:16:07.285775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.285741 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:07.601299 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.601257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.601337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.601451 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.601455 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.601481 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.601492 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.601520 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:08.601502569 +0000 UTC m=+4.180633318 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:07.601545 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.601542 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:08.601530429 +0000 UTC m=+4.180661175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:07.643750 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.643717 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e92a775_6760_473c_a919_b7e7bcf242c5.slice/crio-a60da437d5a7ba961aac216b17b9f557f1d8d5c256fc932f0c0184a23d8c0caf WatchSource:0}: Error finding container a60da437d5a7ba961aac216b17b9f557f1d8d5c256fc932f0c0184a23d8c0caf: Status 404 returned error can't find the container with id a60da437d5a7ba961aac216b17b9f557f1d8d5c256fc932f0c0184a23d8c0caf Apr 17 11:16:07.645880 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.645852 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf30c3d2_0c10_4a19_94e7_a09f60737213.slice/crio-527452f79b3e8f32639f77cef15f6145959f2b621b881bf78a05cac0444fbb6d WatchSource:0}: Error finding container 527452f79b3e8f32639f77cef15f6145959f2b621b881bf78a05cac0444fbb6d: Status 404 returned error can't find the container with id 527452f79b3e8f32639f77cef15f6145959f2b621b881bf78a05cac0444fbb6d Apr 17 11:16:07.670214 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.669685 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7453d4e_eec7_403a_9cc6_a6a9daa972b0.slice/crio-18d63ed4163956ea9463bdf1cecf8fd5b90cabf2c90e421947dd48e22aa8f224 WatchSource:0}: Error finding container 18d63ed4163956ea9463bdf1cecf8fd5b90cabf2c90e421947dd48e22aa8f224: Status 404 returned error can't find the container with id 18d63ed4163956ea9463bdf1cecf8fd5b90cabf2c90e421947dd48e22aa8f224 Apr 17 11:16:07.672377 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.672293 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod498d4ba7_6d35_426a_ab1f_fbd6ce54d9bf.slice/crio-bbc7a7253dc0ffec0336bd9c75726f5ed12a850d9e291b7d263eb56cd690898f WatchSource:0}: Error finding container bbc7a7253dc0ffec0336bd9c75726f5ed12a850d9e291b7d263eb56cd690898f: Status 404 returned error can't find the container with id bbc7a7253dc0ffec0336bd9c75726f5ed12a850d9e291b7d263eb56cd690898f Apr 17 11:16:07.673526 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.673498 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b17b1c_2eb7_4ac2_bfa8_1dd57a5dd282.slice/crio-6c9392ca7bc95649ed6c6ad532d115a55e0e7050dac8dc81d12e75f7f1767da4 WatchSource:0}: Error finding container 6c9392ca7bc95649ed6c6ad532d115a55e0e7050dac8dc81d12e75f7f1767da4: Status 404 returned error can't find the container with id 6c9392ca7bc95649ed6c6ad532d115a55e0e7050dac8dc81d12e75f7f1767da4 Apr 17 11:16:07.674024 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.673979 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe841d65_b3e0_4db9_8f30_f2b6baf3f1c7.slice/crio-287423b597daab09e73f980f247f5f1f2ed176a11a6b4ff9914f05694abd1937 WatchSource:0}: Error finding container 287423b597daab09e73f980f247f5f1f2ed176a11a6b4ff9914f05694abd1937: Status 404 returned error can't find the container with id 287423b597daab09e73f980f247f5f1f2ed176a11a6b4ff9914f05694abd1937 Apr 17 11:16:07.675713 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.675691 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02807a83_8fd7_42bf_a119_50f4017a2833.slice/crio-b1ffc83379902c4f75f55796c599a2d2304c0d8836ad4c871f0df44b9fdad49d WatchSource:0}: Error finding container b1ffc83379902c4f75f55796c599a2d2304c0d8836ad4c871f0df44b9fdad49d: Status 404 returned error can't find the container with id b1ffc83379902c4f75f55796c599a2d2304c0d8836ad4c871f0df44b9fdad49d Apr 17 11:16:07.677625 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.677120 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c25c59_0494_490c_9cea_82d3c5d19215.slice/crio-0817c5095fd336c825c05c4885cafec6bff51d530a9e80d5a24e17f2c4c9b0fb WatchSource:0}: Error finding container 0817c5095fd336c825c05c4885cafec6bff51d530a9e80d5a24e17f2c4c9b0fb: Status 404 returned error can't find the container with id 0817c5095fd336c825c05c4885cafec6bff51d530a9e80d5a24e17f2c4c9b0fb Apr 17 11:16:07.678808 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:16:07.678425 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod282d56ec_09a3_4c2e_a098_e8271c1f2147.slice/crio-ce6f25c8a8525a6b6b271ef414e229b733850432ece43fb4eb3cae8bf054a043 WatchSource:0}: Error finding container ce6f25c8a8525a6b6b271ef414e229b733850432ece43fb4eb3cae8bf054a043: Status 404 returned error can't find the container with id ce6f25c8a8525a6b6b271ef414e229b733850432ece43fb4eb3cae8bf054a043 Apr 17 11:16:07.884447 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.884176 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dhjrk"] Apr 17 11:16:07.887284 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.887258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:07.887412 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:07.887330 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:07.941164 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.941109 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:05 +0000 UTC" deadline="2027-10-01 14:48:30.708101415 +0000 UTC" Apr 17 11:16:07.941164 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.941158 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12771h32m22.766946307s" Apr 17 11:16:07.990743 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.990703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerStarted","Data":"ce6f25c8a8525a6b6b271ef414e229b733850432ece43fb4eb3cae8bf054a043"} Apr 17 11:16:07.992037 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.992002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n8ccr" event={"ID":"02807a83-8fd7-42bf-a119-50f4017a2833","Type":"ContainerStarted","Data":"b1ffc83379902c4f75f55796c599a2d2304c0d8836ad4c871f0df44b9fdad49d"} Apr 17 11:16:07.993280 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.993244 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"287423b597daab09e73f980f247f5f1f2ed176a11a6b4ff9914f05694abd1937"} Apr 17 11:16:07.994497 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.994462 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mhcgl" event={"ID":"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282","Type":"ContainerStarted","Data":"6c9392ca7bc95649ed6c6ad532d115a55e0e7050dac8dc81d12e75f7f1767da4"} Apr 17 11:16:07.995919 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.995886 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w2vns" event={"ID":"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf","Type":"ContainerStarted","Data":"bbc7a7253dc0ffec0336bd9c75726f5ed12a850d9e291b7d263eb56cd690898f"} Apr 17 11:16:07.997309 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.997278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" event={"ID":"b7453d4e-eec7-403a-9cc6-a6a9daa972b0","Type":"ContainerStarted","Data":"18d63ed4163956ea9463bdf1cecf8fd5b90cabf2c90e421947dd48e22aa8f224"} Apr 17 11:16:07.998334 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:07.998309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" event={"ID":"df30c3d2-0c10-4a19-94e7-a09f60737213","Type":"ContainerStarted","Data":"527452f79b3e8f32639f77cef15f6145959f2b621b881bf78a05cac0444fbb6d"} Apr 17 11:16:08.000425 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.000397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" event={"ID":"0ef42997bfb50561dbaa710ba959617d","Type":"ContainerStarted","Data":"9f1243711593a4731621c9d7840aceb2d5d8716c7b23eb2bf4182cec02284d93"} Apr 17 11:16:08.001638 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.001611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dwf52" 
event={"ID":"44c25c59-0494-490c-9cea-82d3c5d19215","Type":"ContainerStarted","Data":"0817c5095fd336c825c05c4885cafec6bff51d530a9e80d5a24e17f2c4c9b0fb"} Apr 17 11:16:08.002701 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.002677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-szsmw" event={"ID":"6e92a775-6760-473c-a919-b7e7bcf242c5","Type":"ContainerStarted","Data":"a60da437d5a7ba961aac216b17b9f557f1d8d5c256fc932f0c0184a23d8c0caf"} Apr 17 11:16:08.004734 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.004698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9ac73cd-3758-404e-bd44-1926d9b9ac58-kubelet-config\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.004847 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.004831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9ac73cd-3758-404e-bd44-1926d9b9ac58-dbus\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.004898 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.004858 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.016879 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.016813 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-230.ec2.internal" podStartSLOduration=2.016793419 podStartE2EDuration="2.016793419s" podCreationTimestamp="2026-04-17 11:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:08.016443635 +0000 UTC m=+3.595574383" watchObservedRunningTime="2026-04-17 11:16:08.016793419 +0000 UTC m=+3.595924168" Apr 17 11:16:08.105383 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.105308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9ac73cd-3758-404e-bd44-1926d9b9ac58-dbus\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.105383 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.105388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.105630 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.105431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9ac73cd-3758-404e-bd44-1926d9b9ac58-kubelet-config\") pod \"global-pull-secret-syncer-dhjrk\" (UID: 
\"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.105630 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.105538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9ac73cd-3758-404e-bd44-1926d9b9ac58-kubelet-config\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.105708 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.105686 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9ac73cd-3758-404e-bd44-1926d9b9ac58-dbus\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.105796 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.105780 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:08.105884 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.105846 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:08.605825118 +0000 UTC m=+4.184955856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.609871 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.609933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.609981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610147 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610166 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610179 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610238 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:10.610220378 +0000 UTC m=+6.189351115 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610695 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610749 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:10.610732877 +0000 UTC m=+6.189863605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610805 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:08.610878 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.610835 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:09.610824861 +0000 UTC m=+5.189955589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:08.982298 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.981551 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:08.982298 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.981696 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:08.982298 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.982144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:08.982298 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.982242 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:08.983257 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:08.983101 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:08.983257 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:08.983205 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:09.016418 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:09.015939 2579 generic.go:358] "Generic (PLEG): container finished" podID="53ca2259f548b65113e09992cb274ca5" containerID="5c33513c343fe42fcfc323c86144d3c8e00efea4b4c07450c51c9b4cd0a9eb77" exitCode=0 Apr 17 11:16:09.017514 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:09.017408 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" event={"ID":"53ca2259f548b65113e09992cb274ca5","Type":"ContainerDied","Data":"5c33513c343fe42fcfc323c86144d3c8e00efea4b4c07450c51c9b4cd0a9eb77"} Apr 17 11:16:09.619567 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:09.618818 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:09.619567 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:09.619030 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:09.619567 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:09.619099 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:11.619078356 +0000 UTC m=+7.198209083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:10.024118 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.023310 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" event={"ID":"53ca2259f548b65113e09992cb274ca5","Type":"ContainerStarted","Data":"8c1feabe8d37a778d3eff9865c2d2b40fdc8b3d54a55310119f3fe8d12b4bdf1"} Apr 17 11:16:10.038767 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.038115 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-230.ec2.internal" podStartSLOduration=4.03809246 podStartE2EDuration="4.03809246s" podCreationTimestamp="2026-04-17 11:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:10.03724453 +0000 UTC m=+5.616375277" watchObservedRunningTime="2026-04-17 11:16:10.03809246 +0000 UTC m=+5.617223208" Apr 17 11:16:10.626454 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.626414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:10.626659 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.626497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:10.626746 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.626662 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:10.626746 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.626680 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:10.626746 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.626693 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:10.626891 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.626751 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:14.626733618 +0000 UTC m=+10.205864348 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:10.627183 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.627160 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:10.627273 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.627222 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.627205243 +0000 UTC m=+10.206335971 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:10.980445 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.980348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:10.980445 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.980409 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:10.980675 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:10.980348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:10.980675 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.980508 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:10.980675 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.980578 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:10.980675 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:10.980637 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:11.635601 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:11.635556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:11.636052 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:11.635704 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:11.636052 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:11.635770 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:15.635751917 +0000 UTC m=+11.214882656 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:12.980592 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:12.980519 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:12.981061 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:12.980639 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:12.981061 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:12.980656 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:12.981061 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:12.980701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:12.981061 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:12.980811 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:12.981061 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:12.980942 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:14.663614 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:14.663570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:14.663655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.663713 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.663800 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:22.663770718 +0000 UTC m=+18.242901483 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.663800 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.663834 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.663849 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.664085 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.663884 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:22.663873067 +0000 UTC m=+18.243003798 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.981412 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:14.980967 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:14.981412 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:14.981018 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:14.981412 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.981086 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:14.981412 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.981116 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:14.981412 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:14.981199 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:14.981412 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:14.981307 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:15.671777 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:15.671658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:15.672320 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:15.671807 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:15.672320 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:15.671904 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:23.671883143 +0000 UTC m=+19.251013875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:16.982922 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:16.982835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:16.983422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:16.982836 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:16.983422 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:16.982952 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:16.983422 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:16.983027 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:16.983422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:16.983067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:16.983422 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:16.983135 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:18.980091 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:18.980056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:18.980591 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:18.980056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:18.980591 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:18.980188 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:18.980591 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:18.980284 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:18.980591 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:18.980056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:18.980591 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:18.980399 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:20.982875 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:20.982843 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:20.983314 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:20.982846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:20.983314 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:20.982977 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:20.983314 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:20.982846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:20.983314 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:20.983074 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:20.983314 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:20.983154 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:22.725391 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:22.725329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.725482 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:22.725557 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.725585 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:38.72556318 +0000 UTC m=+34.304693927 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.725679 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.725697 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.725710 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:22.725836 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.725770 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:38.725753998 +0000 UTC m=+34.304884722 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:22.982495 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:22.982415 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:22.982715 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:22.982415 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:22.982715 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.982516 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:22.982715 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:22.982415 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:22.982715 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.982596 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:22.982715 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:22.982689 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:23.732797 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:23.732751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:23.733248 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:23.732911 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:23.733248 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:23.732990 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.732969334 +0000 UTC m=+35.312100070 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:24.980611 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:24.980345 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:24.981328 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:24.980423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:24.981328 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:24.980724 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:24.981328 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:24.980451 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:24.981328 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:24.980844 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:24.981328 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:24.980916 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:25.049705 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.049674 2579 generic.go:358] "Generic (PLEG): container finished" podID="282d56ec-09a3-4c2e-a098-e8271c1f2147" containerID="c69a6f78856cdf941a6b5da7fc8fbea01954ac3d44b7d1c88369afcf42b6e817" exitCode=0 Apr 17 11:16:25.049862 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.049748 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerDied","Data":"c69a6f78856cdf941a6b5da7fc8fbea01954ac3d44b7d1c88369afcf42b6e817"} Apr 17 11:16:25.051287 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.051182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"2e834a307220877db7aa09501e7895e1219c3336cf719f8cdc440a8c889fa0a5"} Apr 17 11:16:25.052755 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.052724 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mhcgl" event={"ID":"01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282","Type":"ContainerStarted","Data":"2f232abd3b4bdc5c4e4d9a15e0f840e5af416a7e76ef8e2cf2018bbe74141616"} Apr 17 11:16:25.054283 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.054173 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w2vns" event={"ID":"498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf","Type":"ContainerStarted","Data":"85e99beb4700422658a72190e1fecb7f2766d5b701e66540089fa774bf726300"} Apr 17 11:16:25.055913 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.055890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" event={"ID":"b7453d4e-eec7-403a-9cc6-a6a9daa972b0","Type":"ContainerStarted","Data":"20f4ba56ab1887b42dc3015e89f1345fa57199da6ae4c4d54819bcf481724792"} Apr 17 11:16:25.057249 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.057225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" event={"ID":"df30c3d2-0c10-4a19-94e7-a09f60737213","Type":"ContainerStarted","Data":"f07b420c2b9320b6bddb1d2182b181bb0e8934ac606322ea0935f21ef1c41320"} Apr 17 11:16:25.058629 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.058606 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dwf52" event={"ID":"44c25c59-0494-490c-9cea-82d3c5d19215","Type":"ContainerStarted","Data":"8fb94beb3d0e61b5eef42aeedf34b3d72aed33beaaccd7fe4db389ec08020722"} Apr 17 11:16:25.059838 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.059813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-szsmw" event={"ID":"6e92a775-6760-473c-a919-b7e7bcf242c5","Type":"ContainerStarted","Data":"d9b23289e781b56a38007d2b4a72d74e806725bb4bb8e8251e2b9e708b196a70"} Apr 17 11:16:25.083367 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.083307 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-c9tz8" podStartSLOduration=3.247086878 podStartE2EDuration="20.083291096s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.666917091 +0000 UTC m=+3.246047821" lastFinishedPulling="2026-04-17 11:16:24.503121297 +0000 UTC 
m=+20.082252039" observedRunningTime="2026-04-17 11:16:25.08282727 +0000 UTC m=+20.661958016" watchObservedRunningTime="2026-04-17 11:16:25.083291096 +0000 UTC m=+20.662421842" Apr 17 11:16:25.098588 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.098540 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dwf52" podStartSLOduration=3.234505877 podStartE2EDuration="20.098516576s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.67895353 +0000 UTC m=+3.258084255" lastFinishedPulling="2026-04-17 11:16:24.542964228 +0000 UTC m=+20.122094954" observedRunningTime="2026-04-17 11:16:25.097968149 +0000 UTC m=+20.677098898" watchObservedRunningTime="2026-04-17 11:16:25.098516576 +0000 UTC m=+20.677647322" Apr 17 11:16:25.150004 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.149794 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w2vns" podStartSLOduration=3.326005745 podStartE2EDuration="20.149773508s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.677863835 +0000 UTC m=+3.256994577" lastFinishedPulling="2026-04-17 11:16:24.501631614 +0000 UTC m=+20.080762340" observedRunningTime="2026-04-17 11:16:25.118811586 +0000 UTC m=+20.697942336" watchObservedRunningTime="2026-04-17 11:16:25.149773508 +0000 UTC m=+20.728904256" Apr 17 11:16:25.167772 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.167719 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mhcgl" podStartSLOduration=3.343562216 podStartE2EDuration="20.167702035s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.677470779 +0000 UTC m=+3.256601504" lastFinishedPulling="2026-04-17 11:16:24.501610592 +0000 UTC m=+20.080741323" observedRunningTime="2026-04-17 11:16:25.152798033 +0000 UTC m=+20.731928801" watchObservedRunningTime="2026-04-17 11:16:25.167702035 +0000 UTC m=+20.746832803" Apr 17 11:16:25.168218 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:25.168182 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-szsmw" podStartSLOduration=3.333333382 podStartE2EDuration="20.168171951s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.666761337 +0000 UTC m=+3.245892062" lastFinishedPulling="2026-04-17 11:16:24.501599891 +0000 UTC m=+20.080730631" observedRunningTime="2026-04-17 11:16:25.167415179 +0000 UTC m=+20.746545924" watchObservedRunningTime="2026-04-17 11:16:25.168171951 +0000 UTC m=+20.747302698" Apr 17 11:16:26.065811 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.065777 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:16:26.066414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.066167 2579 generic.go:358] "Generic (PLEG): container finished" podID="fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7" containerID="cde208c59705aa00a4f9e9e27075ce6460824014a6dc7a5a69d70df8bf3ed7ad" exitCode=1 Apr 17 11:16:26.066414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.066296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" 
event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"e7147f78cbd1120fe6d56e6622c334061225d9465d7ac2f8008c06feab996838"} Apr 17 11:16:26.066414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.066334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"4aa1fa291561a9a3ed4feab0c28c842120adfd9d28e46b397017a6f5acdf919e"} Apr 17 11:16:26.066414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.066348 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"998e1a29bf066d4e7ede18e4b151c34da017811faea1d84dd81f7927e519f482"} Apr 17 11:16:26.066414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.066377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"9ec6921db3aca2d16d4ae35ba0c690345128944473367337bff7b28cee8bacdc"} Apr 17 11:16:26.066414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.066390 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerDied","Data":"cde208c59705aa00a4f9e9e27075ce6460824014a6dc7a5a69d70df8bf3ed7ad"} Apr 17 11:16:26.085719 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.085674 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:16:26.974909 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.974693 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:26.085696683Z","UUID":"8a5b7ad2-79f5-43f2-9f8c-06c6e6ffd775","Handler":null,"Name":"","Endpoint":""} Apr 17 11:16:26.976828 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.976801 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 11:16:26.976828 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.976837 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:16:26.983450 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.983420 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:26.983614 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:26.983545 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:26.983614 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.983568 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:26.983746 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:26.983614 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:26.983746 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:26.983689 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:26.983868 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:26.983771 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:27.070680 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:27.070637 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n8ccr" event={"ID":"02807a83-8fd7-42bf-a119-50f4017a2833","Type":"ContainerStarted","Data":"8ea99890724a865a009d60693e4beefb6777213922c78ac9fa136d8e8df22478"} Apr 17 11:16:27.072583 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:27.072542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" event={"ID":"b7453d4e-eec7-403a-9cc6-a6a9daa972b0","Type":"ContainerStarted","Data":"f80c4b2b627c86054e0f9a15efff4e67654ba05934f541092de01364be386d1d"} Apr 17 11:16:27.098643 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:27.098583 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n8ccr" podStartSLOduration=5.275318106 podStartE2EDuration="22.098567888s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.678689558 +0000 UTC m=+3.257820295" lastFinishedPulling="2026-04-17 11:16:24.501939345 +0000 UTC m=+20.081070077" observedRunningTime="2026-04-17 11:16:27.097837428 +0000 UTC m=+22.676968176" watchObservedRunningTime="2026-04-17 11:16:27.098567888 +0000 UTC m=+22.677698657" Apr 17 11:16:28.077137 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:28.076954 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" event={"ID":"b7453d4e-eec7-403a-9cc6-a6a9daa972b0","Type":"ContainerStarted","Data":"4129e4d1e2c3146e827fa3a6a16f737a61f96396b69bca2e8d1631abc368d4fb"} Apr 17 11:16:28.094428 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:28.094377 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-47nvw" podStartSLOduration=3.342206914 podStartE2EDuration="23.094343528s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.677835093 +0000 UTC m=+3.256965824" lastFinishedPulling="2026-04-17 11:16:27.429971699 +0000 UTC m=+23.009102438" observedRunningTime="2026-04-17 11:16:28.093731072 +0000 UTC m=+23.672861811" 
watchObservedRunningTime="2026-04-17 11:16:28.094343528 +0000 UTC m=+23.673474267" Apr 17 11:16:28.980606 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:28.980556 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:28.980788 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:28.980692 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:28.980788 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:28.980697 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:28.980916 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:28.980813 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:28.980916 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:28.980860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:28.981015 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:28.980966 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:29.627022 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:29.626993 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:29.627618 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:29.627510 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:30.082448 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.082408 2579 generic.go:358] "Generic (PLEG): container finished" podID="282d56ec-09a3-4c2e-a098-e8271c1f2147" containerID="d6439816945eb5ee68bfe7b6f5d7e708e32205a1e5567b9b3f27336e32889e47" exitCode=0 Apr 17 11:16:30.082615 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.082486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerDied","Data":"d6439816945eb5ee68bfe7b6f5d7e708e32205a1e5567b9b3f27336e32889e47"} Apr 17 11:16:30.085405 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.085379 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:16:30.085801 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.085779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"6b0995147215690aeb10de5f8b49b515e275a857a6790744b680fc38be6fd49d"} Apr 17 11:16:30.980422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.980333 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:30.980883 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.980343 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:30.980883 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:30.980394 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:30.980883 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:30.980583 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:30.980883 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:30.980658 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:30.980883 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:30.980728 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:31.090052 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:31.090021 2579 generic.go:358] "Generic (PLEG): container finished" podID="282d56ec-09a3-4c2e-a098-e8271c1f2147" containerID="c1915ed38f839b9b328717383c6e3146319e4fb30ad6c3493dbc71668aaa5a90" exitCode=0 Apr 17 11:16:31.090227 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:31.090075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerDied","Data":"c1915ed38f839b9b328717383c6e3146319e4fb30ad6c3493dbc71668aaa5a90"} Apr 17 11:16:32.094097 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.093910 2579 generic.go:358] "Generic (PLEG): container finished" podID="282d56ec-09a3-4c2e-a098-e8271c1f2147" containerID="1f1d258d0c7c81fe3b729792154d9fb9011281990ed87b279465ace30fa002fd" exitCode=0 Apr 17 11:16:32.094652 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.094005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerDied","Data":"1f1d258d0c7c81fe3b729792154d9fb9011281990ed87b279465ace30fa002fd"} Apr 17 11:16:32.097307 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.097289 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:16:32.097699 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.097676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"b9173cd0ac0dd161c71ffaac1e798a588e47beb55aeadcdb77260ea54c77772d"} Apr 17 11:16:32.097907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.097886 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:32.097988 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.097914 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:32.098090 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.098075 2579 scope.go:117] "RemoveContainer" containerID="cde208c59705aa00a4f9e9e27075ce6460824014a6dc7a5a69d70df8bf3ed7ad" Apr 17 11:16:32.116625 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.116603 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:32.760332 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.760296 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:32.760620 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.760452 2579 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Apr 17 11:16:32.761000 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.760970 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mhcgl" Apr 17 11:16:32.979893 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.979711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:32.979893 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.979752 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:32.979893 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:32.979838 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:32.979893 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:32.979860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:32.980217 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:32.979968 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:32.980217 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:32.980045 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:33.103818 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.103741 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:16:33.104264 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.104117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" event={"ID":"fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7","Type":"ContainerStarted","Data":"6d9d0ba275e9e18e065bb9d987b9f5226aae69b7606961aac66f4f4d1f3739b0"} Apr 17 11:16:33.104480 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.104460 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:33.122309 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.122278 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:16:33.146936 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.146874 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" podStartSLOduration=10.966224216 podStartE2EDuration="28.146854162s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.677884655 +0000 UTC m=+3.257015383" lastFinishedPulling="2026-04-17 11:16:24.858514601 +0000 UTC m=+20.437645329" observedRunningTime="2026-04-17 11:16:33.146157865 +0000 UTC m=+28.725288605" watchObservedRunningTime="2026-04-17 11:16:33.146854162 +0000 UTC m=+28.725984910" Apr 17 11:16:33.761784 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.761751 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dhjrk"] Apr 17 11:16:33.761965 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.761874 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:33.761965 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:33.761952 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:33.770675 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.770643 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n2dw9"] Apr 17 11:16:33.770836 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.770773 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:33.770903 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:33.770878 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:33.772904 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.772877 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxlg6"] Apr 17 11:16:33.773032 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:33.773011 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:33.773141 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:33.773110 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:34.981104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:34.981022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:34.981536 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:34.981114 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:35.980185 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:35.979993 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:35.980385 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:35.979993 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:35.980385 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:35.980307 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:35.980509 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:35.980385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:36.980299 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:36.980265 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:36.980746 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:36.980387 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxlg6" podUID="26dbb6ee-4d95-468d-aafe-1bc2e96c41f1" Apr 17 11:16:37.979651 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:37.979622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:37.979797 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:37.979622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:37.979797 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:37.979740 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:16:37.979873 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:37.979799 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhjrk" podUID="e9ac73cd-3758-404e-bd44-1926d9b9ac58" Apr 17 11:16:38.116637 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.116607 2579 generic.go:358] "Generic (PLEG): container finished" podID="282d56ec-09a3-4c2e-a098-e8271c1f2147" containerID="c33aef3e65801b83ae91b1dc41657185623e81a609856a75b308937edad1e77a" exitCode=0 Apr 17 11:16:38.117303 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.116673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerDied","Data":"c33aef3e65801b83ae91b1dc41657185623e81a609856a75b308937edad1e77a"} Apr 17 11:16:38.204578 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.204514 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-230.ec2.internal" event="NodeReady" Apr 17 11:16:38.204734 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.204679 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:38.250809 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.250781 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8b9d5b99-7ms26"] Apr 17 11:16:38.263108 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.263086 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.266043 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.265990 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 11:16:38.266811 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.266764 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-l4t4t\"" Apr 17 11:16:38.266811 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.266804 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 11:16:38.267248 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.267228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 11:16:38.268677 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.268652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8b9d5b99-7ms26"] Apr 17 11:16:38.272955 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.272917 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jv8wn"] Apr 17 11:16:38.273403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.273382 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 11:16:38.297741 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.297716 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v9vzq"] Apr 17 11:16:38.297901 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.297882 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:38.303685 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.303666 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-whkkk\"" Apr 17 11:16:38.304528 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.304510 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:38.304622 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.304602 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:38.309041 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.309026 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:38.318232 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.318210 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jv8wn"] Apr 17 11:16:38.318232 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.318233 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v9vzq"] Apr 17 11:16:38.318367 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.318341 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.325801 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.325784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvwdc\"" Apr 17 11:16:38.325801 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.325796 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:38.326462 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.326446 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:38.346475 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-installation-pull-secrets\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346589 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-ca-trust-extracted\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346589 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-certificates\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346589 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhlr\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-kube-api-access-9dhlr\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346732 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346595 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-bound-sa-token\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346732 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-trusted-ca\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346812 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-image-registry-private-configuration\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.346812 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.346799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.447514 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-trusted-ca\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.447514 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447523 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3d7992a-df2f-42c4-b112-16554731e7e3-config-volume\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447540 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3d7992a-df2f-42c4-b112-16554731e7e3-tmp-dir\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z749z\" (UniqueName: \"kubernetes.io/projected/d3d7992a-df2f-42c4-b112-16554731e7e3-kube-api-access-z749z\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-image-registry-private-configuration\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-installation-pull-secrets\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-ca-trust-extracted\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.447736 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:38.447773 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.447761 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.447820 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:16:38.947800855 +0000 UTC m=+34.526931583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72wn\" (UniqueName: \"kubernetes.io/projected/623d8865-4f62-459b-929b-d12c6978284a-kube-api-access-b72wn\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-certificates\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhlr\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-kube-api-access-9dhlr\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-bound-sa-token\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.447958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.448189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.448073 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-ca-trust-extracted\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.448583 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.448502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-certificates\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.448636 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.448595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-trusted-ca\") pod 
\"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.451598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.451578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-image-registry-private-configuration\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.451737 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.451578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-installation-pull-secrets\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.469596 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.469520 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-bound-sa-token\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.469712 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.469613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhlr\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-kube-api-access-9dhlr\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.548829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.548795 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3d7992a-df2f-42c4-b112-16554731e7e3-config-volume\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.548829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.548828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3d7992a-df2f-42c4-b112-16554731e7e3-tmp-dir\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.549056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.548879 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z749z\" (UniqueName: \"kubernetes.io/projected/d3d7992a-df2f-42c4-b112-16554731e7e3-kube-api-access-z749z\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.549056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.548911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:38.549056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.548927 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b72wn\" (UniqueName: \"kubernetes.io/projected/623d8865-4f62-459b-929b-d12c6978284a-kube-api-access-b72wn\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:38.549056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.548954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.549056 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.549040 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:38.549269 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.549092 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.049076624 +0000 UTC m=+34.628207364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:16:38.549269 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.549040 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:38.549269 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.549182 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.049160188 +0000 UTC m=+34.628290919 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:16:38.549269 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.549255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3d7992a-df2f-42c4-b112-16554731e7e3-tmp-dir\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.549452 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.549410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3d7992a-df2f-42c4-b112-16554731e7e3-config-volume\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.561939 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.561911 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z749z\" (UniqueName: \"kubernetes.io/projected/d3d7992a-df2f-42c4-b112-16554731e7e3-kube-api-access-z749z\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:38.562080 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.562000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72wn\" (UniqueName: \"kubernetes.io/projected/623d8865-4f62-459b-929b-d12c6978284a-kube-api-access-b72wn\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:38.749930 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.749831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:38.749930 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.749913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:38.750156 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.750020 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:38.750156 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.750022 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:38.750156 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.750114 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:17:10.75009265 +0000 UTC m=+66.329223388 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:38.750156 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.750033 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:38.750156 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.750132 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7m79h for pod openshift-network-diagnostics/network-check-target-cxlg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:38.750441 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.750183 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h podName:26dbb6ee-4d95-468d-aafe-1bc2e96c41f1 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:10.750170107 +0000 UTC m=+66.329300856 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m79h" (UniqueName: "kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h") pod "network-check-target-cxlg6" (UID: "26dbb6ee-4d95-468d-aafe-1bc2e96c41f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:38.951183 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.951143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:38.951329 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.951301 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:38.951329 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.951321 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:16:38.951449 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:38.951400 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.951383686 +0000 UTC m=+35.530514416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:16:38.980206 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.980170 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:16:38.983414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.983383 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:38.983573 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.983437 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dlffp\"" Apr 17 11:16:38.983759 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:38.983742 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:39.051895 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.051851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:39.052064 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.051905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:39.052064 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.052006 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:39.052064 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.052012 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:39.052064 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.052061 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:40.05204671 +0000 UTC m=+35.631177434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:16:39.052201 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.052075 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:16:40.05206969 +0000 UTC m=+35.631200415 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:16:39.121554 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.121515 2579 generic.go:358] "Generic (PLEG): container finished" podID="282d56ec-09a3-4c2e-a098-e8271c1f2147" containerID="d37961c43bb1293218e7d92d1df0030883ef72f0f87a774465b901980c30d2d0" exitCode=0 Apr 17 11:16:39.122108 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.121566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerDied","Data":"d37961c43bb1293218e7d92d1df0030883ef72f0f87a774465b901980c30d2d0"} Apr 17 11:16:39.757554 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.757519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:39.757730 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.757681 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:39.757804 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.757792 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret podName:e9ac73cd-3758-404e-bd44-1926d9b9ac58 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:11.757776586 +0000 UTC m=+67.336907315 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret") pod "global-pull-secret-syncer-dhjrk" (UID: "e9ac73cd-3758-404e-bd44-1926d9b9ac58") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:39.959302 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.959254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:39.959542 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.959415 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:39.959542 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.959431 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:16:39.959542 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:39.959501 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.959480642 +0000 UTC m=+37.538611375 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:16:39.980520 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.980477 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:16:39.980674 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.980477 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:16:39.982625 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.982595 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:39.982776 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.982665 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q4j6t\"" Apr 17 11:16:39.982776 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:39.982695 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:16:40.060244 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:40.060148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:40.060244 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:40.060197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:40.060496 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:40.060296 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:40.060496 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:40.060304 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:40.060496 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:40.060372 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.060338122 +0000 UTC m=+37.639468866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:16:40.060496 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:40.060387 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.06038039 +0000 UTC m=+37.639511114 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:16:40.126879 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:40.126838 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gg86" event={"ID":"282d56ec-09a3-4c2e-a098-e8271c1f2147","Type":"ContainerStarted","Data":"ea0083d0f8952ac3a7237a3269f5e3362399c79e2f383300e99fbbb6efd772f3"} Apr 17 11:16:40.148544 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:40.148489 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6gg86" podStartSLOduration=5.188822027 podStartE2EDuration="35.148472951s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:16:07.68027026 +0000 UTC m=+3.259400998" lastFinishedPulling="2026-04-17 11:16:37.639921196 +0000 UTC m=+33.219051922" observedRunningTime="2026-04-17 11:16:40.148101504 +0000 UTC m=+35.727232252" watchObservedRunningTime="2026-04-17 11:16:40.148472951 +0000 UTC m=+35.727603698" Apr 17 11:16:41.977331 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:41.977139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:41.977725 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:41.977284 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:41.977725 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:41.977405 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:16:41.977725 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:41.977460 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.977442859 +0000 UTC m=+41.556573583 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:16:42.077953 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:42.077910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:42.078126 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:42.077964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:42.078126 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:42.078071 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:42.078126 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:42.078107 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:42.078226 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:42.078140 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.078123863 +0000 UTC m=+41.657254592 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:16:42.078226 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:42.078155 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.078148578 +0000 UTC m=+41.657279303 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:16:46.008083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:46.008028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:46.008508 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.008184 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:46.008508 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.008200 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:16:46.008508 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.008255 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:16:54.008237596 +0000 UTC m=+49.587368338 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:16:46.108620 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:46.108570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:46.108793 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:46.108703 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:46.108793 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.108728 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:46.108793 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.108780 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:46.108890 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.108802 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:54.10878016 +0000 UTC m=+49.687910904 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:16:46.108890 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:46.108819 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:16:54.108810436 +0000 UTC m=+49.687941176 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:16:54.076615 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:54.076569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:16:54.077138 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.076720 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:54.077138 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.076741 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:16:54.077138 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.076797 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:17:10.076782328 +0000 UTC m=+65.655913056 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:16:54.177772 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:54.177737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:16:54.177965 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:16:54.177788 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:16:54.177965 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.177884 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:54.177965 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.177892 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:54.177965 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.177962 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:17:10.177943059 +0000 UTC m=+65.757073789 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:16:54.178128 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:16:54.177982 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:10.177972489 +0000 UTC m=+65.757103215 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:17:05.118852 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:05.118825 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8dz7" Apr 17 11:17:10.089820 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.089783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:17:10.090209 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.089938 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:17:10.090209 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.089959 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:17:10.090209 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.090017 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:17:42.090000579 +0000 UTC m=+97.669131304 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:17:10.190168 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.190141 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:17:10.190320 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.190180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:17:10.190320 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.190284 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:10.190424 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.190339 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:17:42.190326746 +0000 UTC m=+97.769457476 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:17:10.190424 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.190411 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:10.190502 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.190467 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:42.19045476 +0000 UTC m=+97.769585486 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:17:10.794166 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.794130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:17:10.794374 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.794186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:17:10.796440 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.796412 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:10.796499 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.796463 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:10.805057 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.805033 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:17:10.805148 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:10.805111 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:18:14.805096394 +0000 UTC m=+130.384227120 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : secret "metrics-daemon-secret" not found Apr 17 11:17:10.806917 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.806895 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:10.819070 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:10.819045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m79h\" (UniqueName: \"kubernetes.io/projected/26dbb6ee-4d95-468d-aafe-1bc2e96c41f1-kube-api-access-7m79h\") pod \"network-check-target-cxlg6\" (UID: \"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1\") " pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:17:11.093224 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:11.093142 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dlffp\"" Apr 17 11:17:11.101784 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:11.101758 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:17:11.228570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:11.228532 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxlg6"] Apr 17 11:17:11.232668 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:17:11.232642 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26dbb6ee_4d95_468d_aafe_1bc2e96c41f1.slice/crio-9a5f481bac7b2265a1a79f68037d74c9b1c831263c67838927e73fadf8533403 WatchSource:0}: Error finding container 9a5f481bac7b2265a1a79f68037d74c9b1c831263c67838927e73fadf8533403: Status 404 returned error can't find the container with id 9a5f481bac7b2265a1a79f68037d74c9b1c831263c67838927e73fadf8533403 Apr 17 11:17:11.801284 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:11.801252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:17:11.803657 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:11.803631 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:17:11.813896 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:11.813875 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9ac73cd-3758-404e-bd44-1926d9b9ac58-original-pull-secret\") pod \"global-pull-secret-syncer-dhjrk\" (UID: \"e9ac73cd-3758-404e-bd44-1926d9b9ac58\") " pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:17:12.091612 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:12.091528 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhjrk" Apr 17 11:17:12.188689 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:12.188650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxlg6" event={"ID":"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1","Type":"ContainerStarted","Data":"9a5f481bac7b2265a1a79f68037d74c9b1c831263c67838927e73fadf8533403"} Apr 17 11:17:12.228651 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:12.228620 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dhjrk"] Apr 17 11:17:12.232819 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:17:12.232791 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ac73cd_3758_404e_bd44_1926d9b9ac58.slice/crio-33a857bf7880d07156370ddb3fd1ac64e63885b6500a1565797612bbce264bf6 WatchSource:0}: Error finding container 33a857bf7880d07156370ddb3fd1ac64e63885b6500a1565797612bbce264bf6: Status 404 returned error can't find the container with id 33a857bf7880d07156370ddb3fd1ac64e63885b6500a1565797612bbce264bf6 Apr 17 11:17:13.191812 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:13.191777 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dhjrk" event={"ID":"e9ac73cd-3758-404e-bd44-1926d9b9ac58","Type":"ContainerStarted","Data":"33a857bf7880d07156370ddb3fd1ac64e63885b6500a1565797612bbce264bf6"} Apr 17 11:17:14.195240 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:14.195202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxlg6" event={"ID":"26dbb6ee-4d95-468d-aafe-1bc2e96c41f1","Type":"ContainerStarted","Data":"4c336136d8f5ad9512d79a44e3c8381377d7f1be132de4a170c9393ff486de3c"} Apr 17 11:17:14.195708 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:14.195471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:17:14.996210 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:14.996159 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cxlg6" podStartSLOduration=67.190642946 podStartE2EDuration="1m9.996142441s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:17:11.235012019 +0000 UTC m=+66.814142747" lastFinishedPulling="2026-04-17 11:17:14.040511504 +0000 UTC m=+69.619642242" observedRunningTime="2026-04-17 11:17:14.21899275 +0000 UTC m=+69.798123497" watchObservedRunningTime="2026-04-17 11:17:14.996142441 +0000 UTC m=+70.575273191" Apr 17 11:17:16.200321 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:16.200238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dhjrk" event={"ID":"e9ac73cd-3758-404e-bd44-1926d9b9ac58","Type":"ContainerStarted","Data":"47243b27f556c30354e055d9a6300909d1aa7533f2f99fe81855bc8727d7f279"} Apr 17 11:17:16.218391 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:16.218337 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dhjrk" podStartSLOduration=65.608173545 podStartE2EDuration="1m9.218323445s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:17:12.235416813 +0000 UTC m=+67.814547539" lastFinishedPulling="2026-04-17 11:17:15.845566697 +0000 UTC 
m=+71.424697439" observedRunningTime="2026-04-17 11:17:16.217431517 +0000 UTC m=+71.796562275" watchObservedRunningTime="2026-04-17 11:17:16.218323445 +0000 UTC m=+71.797454192" Apr 17 11:17:42.115801 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:42.115641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:17:42.116211 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.115904 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:17:42.116211 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.115930 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8b9d5b99-7ms26: secret "image-registry-tls" not found Apr 17 11:17:42.116211 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.116014 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls podName:bde7b410-7e1b-49fd-9dc0-496cc0d7662c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.11599067 +0000 UTC m=+161.695121414 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls") pod "image-registry-8b9d5b99-7ms26" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c") : secret "image-registry-tls" not found Apr 17 11:17:42.216201 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:42.216160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:17:42.216201 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:42.216205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:17:42.216437 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.216291 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:42.216437 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.216305 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:42.216437 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.216399 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert podName:623d8865-4f62-459b-929b-d12c6978284a nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.216376811 +0000 UTC m=+161.795507538 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert") pod "ingress-canary-jv8wn" (UID: "623d8865-4f62-459b-929b-d12c6978284a") : secret "canary-serving-cert" not found Apr 17 11:17:42.216534 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:17:42.216441 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls podName:d3d7992a-df2f-42c4-b112-16554731e7e3 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.216428137 +0000 UTC m=+161.795558862 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls") pod "dns-default-v9vzq" (UID: "d3d7992a-df2f-42c4-b112-16554731e7e3") : secret "dns-default-metrics-tls" not found Apr 17 11:17:45.200195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:17:45.200156 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cxlg6" Apr 17 11:18:14.858152 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:14.858103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:18:14.858675 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:14.858262 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:18:14.858675 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:14.858330 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs podName:00d87433-8bc9-4d18-bab8-e6f889a4b52d nodeName:}" failed. No retries permitted until 2026-04-17 11:20:16.858309909 +0000 UTC m=+252.437440864 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs") pod "network-metrics-daemon-n2dw9" (UID: "00d87433-8bc9-4d18-bab8-e6f889a4b52d") : secret "metrics-daemon-secret" not found Apr 17 11:18:29.077455 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.077420 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b"] Apr 17 11:18:29.080233 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.080217 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.082450 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.082426 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 11:18:29.083646 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.083614 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6"] Apr 17 11:18:29.086262 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.086242 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8"] Apr 17 11:18:29.086414 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.086397 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" Apr 17 11:18:29.087591 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.087565 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 11:18:29.087682 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.087565 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 11:18:29.087682 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.087564 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-b7hth\"" Apr 17 11:18:29.087682 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.087570 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:18:29.088510 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.088497 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 11:18:29.088972 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.088956 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-764bf577f6-vwbj5"] Apr 17 11:18:29.089094 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.089080 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.089510 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.089488 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:18:29.089610 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.089558 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rlp6r\"" Apr 17 11:18:29.091513 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.091495 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:18:29.091593 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.091565 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5mnr6\"" Apr 17 11:18:29.091681 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.091664 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4"] Apr 17 11:18:29.091811 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.091795 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.093488 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.093467 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 11:18:29.093988 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.093965 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 11:18:29.093988 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.093983 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:18:29.094133 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.093966 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 11:18:29.094329 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.094310 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 11:18:29.094805 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.094408 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.095551 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.095465 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 11:18:29.095891 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.095869 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b"] Apr 17 11:18:29.096188 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.096168 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 11:18:29.097223 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.097185 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6"] Apr 17 11:18:29.097834 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.097675 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 11:18:29.097834 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.097739 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 11:18:29.097834 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.097756 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6pxcg\"" Apr 17 11:18:29.098089 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.098032 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:18:29.098148 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.098102 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 11:18:29.098268 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.098243 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bjzrm\"" Apr 17 11:18:29.098268 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.098277 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8"] Apr 17 11:18:29.098622 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.098605 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 11:18:29.100031 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.100014 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-764bf577f6-vwbj5"] Apr 17 11:18:29.120261 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.120230 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4"] Apr 17 11:18:29.160714 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160678 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-stats-auth\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " 
pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.160714 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160719 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.160914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-config\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.160914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.160914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160865 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.160914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdls\" (UniqueName: \"kubernetes.io/projected/c0ab458a-5b75-403d-9929-6914742dd815-kube-api-access-hrdls\") pod \"volume-data-source-validator-7c6cbb6c87-bhjx6\" (UID: \"c0ab458a-5b75-403d-9929-6914742dd815\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" Apr 17 11:18:29.161052 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-default-certificate\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.161052 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrmv9\" (UniqueName: \"kubernetes.io/projected/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-kube-api-access-nrmv9\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.161052 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.160975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.161149 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.161046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.161149 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.161080 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbq7h\" (UniqueName: \"kubernetes.io/projected/50377741-7ad3-48c1-aea2-6c86f69d3a25-kube-api-access-nbq7h\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.161149 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.161122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmlcd\" (UniqueName: \"kubernetes.io/projected/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-kube-api-access-rmlcd\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.161258 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.161181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3746d925-d855-4fc1-b6a3-cb7afc7b2395-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.161258 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.161226 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw47q\" (UniqueName: \"kubernetes.io/projected/3746d925-d855-4fc1-b6a3-cb7afc7b2395-kube-api-access-hw47q\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.262365 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-stats-auth\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.262527 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.262527 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:18:29.262399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-config\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.262527 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.262527 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.262527 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdls\" (UniqueName: \"kubernetes.io/projected/c0ab458a-5b75-403d-9929-6914742dd815-kube-api-access-hrdls\") pod \"volume-data-source-validator-7c6cbb6c87-bhjx6\" (UID: \"c0ab458a-5b75-403d-9929-6914742dd815\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" Apr 17 11:18:29.262527 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-default-certificate\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrmv9\" (UniqueName: \"kubernetes.io/projected/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-kube-api-access-nrmv9\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.262570 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.262622 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:29.762598253 +0000 UTC m=+145.341728998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.262699 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls podName:50377741-7ad3-48c1-aea2-6c86f69d3a25 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:29.762666777 +0000 UTC m=+145.341797517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xh8b4" (UID: "50377741-7ad3-48c1-aea2-6c86f69d3a25") : secret "samples-operator-tls" not found Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.262826 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbq7h\" (UniqueName: \"kubernetes.io/projected/50377741-7ad3-48c1-aea2-6c86f69d3a25-kube-api-access-nbq7h\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.262840 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmlcd\" (UniqueName: \"kubernetes.io/projected/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-kube-api-access-rmlcd\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.262884 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls podName:3746d925-d855-4fc1-b6a3-cb7afc7b2395 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:29.762870606 +0000 UTC m=+145.342001332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zgdf8" (UID: "3746d925-d855-4fc1-b6a3-cb7afc7b2395") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3746d925-d855-4fc1-b6a3-cb7afc7b2395-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.262958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw47q\" (UniqueName: \"kubernetes.io/projected/3746d925-d855-4fc1-b6a3-cb7afc7b2395-kube-api-access-hw47q\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.263022 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.263085 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-config\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.263171 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.263148 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:29.763129592 +0000 UTC m=+145.342260319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : secret "router-metrics-certs-default" not found Apr 17 11:18:29.263842 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.263815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3746d925-d855-4fc1-b6a3-cb7afc7b2395-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.265076 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.265049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-stats-auth\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.265176 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.265086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.265176 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.265150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-default-certificate\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.276394 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.276348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdls\" (UniqueName: \"kubernetes.io/projected/c0ab458a-5b75-403d-9929-6914742dd815-kube-api-access-hrdls\") pod \"volume-data-source-validator-7c6cbb6c87-bhjx6\" (UID: \"c0ab458a-5b75-403d-9929-6914742dd815\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" Apr 17 11:18:29.277708 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.277688 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbq7h\" (UniqueName: \"kubernetes.io/projected/50377741-7ad3-48c1-aea2-6c86f69d3a25-kube-api-access-nbq7h\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.277925 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.277887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmlcd\" (UniqueName: \"kubernetes.io/projected/1b1e2d47-0361-4bc8-bc22-76c310dda1e0-kube-api-access-rmlcd\") pod \"service-ca-operator-d6fc45fc5-ws96b\" (UID: \"1b1e2d47-0361-4bc8-bc22-76c310dda1e0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.278750 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.278730 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw47q\" 
(UniqueName: \"kubernetes.io/projected/3746d925-d855-4fc1-b6a3-cb7afc7b2395-kube-api-access-hw47q\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.278896 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.278878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrmv9\" (UniqueName: \"kubernetes.io/projected/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-kube-api-access-nrmv9\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.390853 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.390757 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" Apr 17 11:18:29.399532 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.399501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" Apr 17 11:18:29.524183 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.524142 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b"] Apr 17 11:18:29.528164 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:29.528137 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1e2d47_0361_4bc8_bc22_76c310dda1e0.slice/crio-6d427b398a618562d596566b86c27adb2fffc4e7df42e00c0396c9367707df30 WatchSource:0}: Error finding container 6d427b398a618562d596566b86c27adb2fffc4e7df42e00c0396c9367707df30: Status 404 returned error can't find the container with id 6d427b398a618562d596566b86c27adb2fffc4e7df42e00c0396c9367707df30 Apr 17 11:18:29.535180 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.535159 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6"] Apr 17 11:18:29.539507 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:29.539481 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ab458a_5b75_403d_9929_6914742dd815.slice/crio-b7806a6e7214c6df1f76696d5fb942ef1ecaaf5f612763ff51df6b808eb7fc65 WatchSource:0}: Error finding container b7806a6e7214c6df1f76696d5fb942ef1ecaaf5f612763ff51df6b808eb7fc65: Status 404 returned error can't find the container with id b7806a6e7214c6df1f76696d5fb942ef1ecaaf5f612763ff51df6b808eb7fc65 Apr 17 11:18:29.767114 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.767085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.767310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.767122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") 
" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:29.767310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.767169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:29.767310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:29.767188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:29.767310 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767258 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:29.767310 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767294 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:29.767310 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767298 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:29.767576 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767342 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:30.767319691 +0000 UTC m=+146.346450429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : secret "router-metrics-certs-default" not found Apr 17 11:18:29.767576 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767388 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls podName:50377741-7ad3-48c1-aea2-6c86f69d3a25 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:30.767376168 +0000 UTC m=+146.346506893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xh8b4" (UID: "50377741-7ad3-48c1-aea2-6c86f69d3a25") : secret "samples-operator-tls" not found Apr 17 11:18:29.767576 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767404 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:30.767395691 +0000 UTC m=+146.346526422 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:29.767576 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:29.767423 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls podName:3746d925-d855-4fc1-b6a3-cb7afc7b2395 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:30.767414814 +0000 UTC m=+146.346545540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zgdf8" (UID: "3746d925-d855-4fc1-b6a3-cb7afc7b2395") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:30.346870 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:30.346830 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" event={"ID":"c0ab458a-5b75-403d-9929-6914742dd815","Type":"ContainerStarted","Data":"b7806a6e7214c6df1f76696d5fb942ef1ecaaf5f612763ff51df6b808eb7fc65"} Apr 17 11:18:30.348198 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:30.348170 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" event={"ID":"1b1e2d47-0361-4bc8-bc22-76c310dda1e0","Type":"ContainerStarted","Data":"6d427b398a618562d596566b86c27adb2fffc4e7df42e00c0396c9367707df30"} Apr 17 11:18:30.776740 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:30.776702 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:30.776898 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:30.776763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:30.776898 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:30.776821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:30.776898 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:30.776857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:30.776898 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.776882 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:32.77685774 +0000 UTC m=+148.355988466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:30.777096 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.776913 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:30.777096 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.776954 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls podName:50377741-7ad3-48c1-aea2-6c86f69d3a25 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:32.776942436 +0000 UTC m=+148.356073178 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xh8b4" (UID: "50377741-7ad3-48c1-aea2-6c86f69d3a25") : secret "samples-operator-tls" not found Apr 17 11:18:30.777096 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.776954 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:30.777096 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.776995 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:30.777096 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.777006 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls podName:3746d925-d855-4fc1-b6a3-cb7afc7b2395 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:32.776994541 +0000 UTC m=+148.356125266 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zgdf8" (UID: "3746d925-d855-4fc1-b6a3-cb7afc7b2395") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:30.777096 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:30.777063 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:32.777047791 +0000 UTC m=+148.356178522 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : secret "router-metrics-certs-default" not found Apr 17 11:18:31.351789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:31.351748 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" event={"ID":"1b1e2d47-0361-4bc8-bc22-76c310dda1e0","Type":"ContainerStarted","Data":"cf6b294eb66588735274f99e6532874c8511adab874a3e728e9be2025e60363b"} Apr 17 11:18:31.353085 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:31.353059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" event={"ID":"c0ab458a-5b75-403d-9929-6914742dd815","Type":"ContainerStarted","Data":"3b14999024cce854d3342655ac60ff1fd34ca92220b98f7a3bb3fc0930d93918"} Apr 17 11:18:31.366442 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:31.366392 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" podStartSLOduration=0.666974788 podStartE2EDuration="2.366380487s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:18:29.530060341 +0000 UTC m=+145.109191066" lastFinishedPulling="2026-04-17 11:18:31.229466035 +0000 UTC m=+146.808596765" observedRunningTime="2026-04-17 11:18:31.36542146 +0000 UTC m=+146.944552208" watchObservedRunningTime="2026-04-17 11:18:31.366380487 +0000 UTC m=+146.945511233" Apr 17 11:18:31.379158 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:31.379109 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bhjx6" podStartSLOduration=0.735895253 podStartE2EDuration="2.379090484s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:18:29.541227455 +0000 UTC m=+145.120358181" lastFinishedPulling="2026-04-17 11:18:31.184422685 +0000 UTC m=+146.763553412" observedRunningTime="2026-04-17 11:18:31.37849504 +0000 UTC m=+146.957625788" watchObservedRunningTime="2026-04-17 11:18:31.379090484 +0000 UTC m=+146.958221232" Apr 17 11:18:32.794996 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:32.794950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:32.795004 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:32.795073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod 
\"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795094 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:32.795108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795162 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795166 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:36.795151063 +0000 UTC m=+152.374281788 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : secret "router-metrics-certs-default" not found Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795246 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795252 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:36.795241563 +0000 UTC m=+152.374372289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795265 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls podName:3746d925-d855-4fc1-b6a3-cb7afc7b2395 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:36.795259884 +0000 UTC m=+152.374390610 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zgdf8" (UID: "3746d925-d855-4fc1-b6a3-cb7afc7b2395") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:32.795499 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:32.795295 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls podName:50377741-7ad3-48c1-aea2-6c86f69d3a25 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:36.795280676 +0000 UTC m=+152.374411403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xh8b4" (UID: "50377741-7ad3-48c1-aea2-6c86f69d3a25") : secret "samples-operator-tls" not found Apr 17 11:18:34.345167 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.345132 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x"] Apr 17 11:18:34.348170 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.348154 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" Apr 17 11:18:34.351249 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.351226 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 11:18:34.353627 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.353603 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 11:18:34.353936 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.353918 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-l7r52\"" Apr 17 11:18:34.364262 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.364241 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x"] Apr 17 11:18:34.408163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.408131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6rh\" (UniqueName: \"kubernetes.io/projected/7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0-kube-api-access-qz6rh\") pod \"migrator-74bb7799d9-jkm5x\" (UID: \"7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" Apr 17 11:18:34.508685 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.508647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6rh\" (UniqueName: \"kubernetes.io/projected/7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0-kube-api-access-qz6rh\") pod \"migrator-74bb7799d9-jkm5x\" (UID: \"7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" Apr 17 11:18:34.517468 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.517443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6rh\" (UniqueName: 
\"kubernetes.io/projected/7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0-kube-api-access-qz6rh\") pod \"migrator-74bb7799d9-jkm5x\" (UID: \"7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" Apr 17 11:18:34.656685 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.656584 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" Apr 17 11:18:34.804505 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:34.804473 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x"] Apr 17 11:18:34.807879 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:34.807851 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfd83d0_4383_41e1_9dc0_7aedbd04ddb0.slice/crio-b21b59e1612612ad67986e3b96331363264616586362e88effe21a4c0828b39f WatchSource:0}: Error finding container b21b59e1612612ad67986e3b96331363264616586362e88effe21a4c0828b39f: Status 404 returned error can't find the container with id b21b59e1612612ad67986e3b96331363264616586362e88effe21a4c0828b39f Apr 17 11:18:35.362241 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:35.362206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" event={"ID":"7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0","Type":"ContainerStarted","Data":"b21b59e1612612ad67986e3b96331363264616586362e88effe21a4c0828b39f"} Apr 17 11:18:35.988647 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:35.988624 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w2vns_498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf/dns-node-resolver/0.log" Apr 17 11:18:36.366610 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.366569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" event={"ID":"7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0","Type":"ContainerStarted","Data":"1db95105708ff8018aa795f63e283c2cc3383b7370182afd482d7bf5b8d9262a"} Apr 17 11:18:36.366610 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.366615 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" event={"ID":"7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0","Type":"ContainerStarted","Data":"ee8cdc9c3fd4262d174e0eeb388b27173a9705099f2c6052aa9fb4657f476276"} Apr 17 11:18:36.826370 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.826324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:36.826520 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.826383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:36.826520 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.826438 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:36.826520 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.826459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:36.826520 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826474 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:36.826729 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826541 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:36.826729 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826552 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.826531244 +0000 UTC m=+160.405661982 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : secret "router-metrics-certs-default" not found Apr 17 11:18:36.826729 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826557 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:36.826729 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826580 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle podName:88f36c9d-2a1c-4fa7-b48b-0bcca001c665 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.826564128 +0000 UTC m=+160.405694858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle") pod "router-default-764bf577f6-vwbj5" (UID: "88f36c9d-2a1c-4fa7-b48b-0bcca001c665") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:36.826729 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826600 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls podName:3746d925-d855-4fc1-b6a3-cb7afc7b2395 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.826593808 +0000 UTC m=+160.405724533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zgdf8" (UID: "3746d925-d855-4fc1-b6a3-cb7afc7b2395") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:36.826729 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:36.826610 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls podName:50377741-7ad3-48c1-aea2-6c86f69d3a25 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.826605003 +0000 UTC m=+160.405735728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xh8b4" (UID: "50377741-7ad3-48c1-aea2-6c86f69d3a25") : secret "samples-operator-tls" not found Apr 17 11:18:36.987464 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:36.987437 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-szsmw_6e92a775-6760-473c-a919-b7e7bcf242c5/node-ca/0.log" Apr 17 11:18:41.276097 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:41.276052 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" Apr 17 11:18:41.307243 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:41.307200 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jv8wn" podUID="623d8865-4f62-459b-929b-d12c6978284a" Apr 17 11:18:41.326448 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:41.326395 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-v9vzq" podUID="d3d7992a-df2f-42c4-b112-16554731e7e3" Apr 17 11:18:41.378147 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:41.378116 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:18:41.378147 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:41.378142 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:18:42.996417 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:42.996347 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-n2dw9" podUID="00d87433-8bc9-4d18-bab8-e6f889a4b52d" Apr 17 11:18:44.893626 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.893584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:44.893626 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.893633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:44.894077 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.893687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:44.894077 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.893711 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:18:44.894077 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:44.893814 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:44.894077 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:18:44.893864 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls podName:3746d925-d855-4fc1-b6a3-cb7afc7b2395 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:00.893850653 +0000 UTC m=+176.472981378 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zgdf8" (UID: "3746d925-d855-4fc1-b6a3-cb7afc7b2395") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:44.894346 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.894320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-service-ca-bundle\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:44.896244 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.896216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50377741-7ad3-48c1-aea2-6c86f69d3a25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xh8b4\" (UID: \"50377741-7ad3-48c1-aea2-6c86f69d3a25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:44.896372 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:44.896216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88f36c9d-2a1c-4fa7-b48b-0bcca001c665-metrics-certs\") pod \"router-default-764bf577f6-vwbj5\" (UID: \"88f36c9d-2a1c-4fa7-b48b-0bcca001c665\") " pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:45.013390 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.013336 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:45.019165 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.019127 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" Apr 17 11:18:45.149402 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.149273 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jkm5x" podStartSLOduration=9.993296081 podStartE2EDuration="11.149255662s" podCreationTimestamp="2026-04-17 11:18:34 +0000 UTC" firstStartedPulling="2026-04-17 11:18:34.809781673 +0000 UTC m=+150.388912398" lastFinishedPulling="2026-04-17 11:18:35.965741254 +0000 UTC m=+151.544871979" observedRunningTime="2026-04-17 11:18:36.383144801 +0000 UTC m=+151.962275550" watchObservedRunningTime="2026-04-17 11:18:45.149255662 +0000 UTC m=+160.728386465" Apr 17 11:18:45.149608 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.149591 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-764bf577f6-vwbj5"] Apr 17 11:18:45.153608 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:45.153569 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f36c9d_2a1c_4fa7_b48b_0bcca001c665.slice/crio-4fbf0101414f9533e738d1551130b2e3ceeb1d71a46064cd3d398a10f656c399 WatchSource:0}: Error finding container 4fbf0101414f9533e738d1551130b2e3ceeb1d71a46064cd3d398a10f656c399: Status 404 returned error can't find the container with id 4fbf0101414f9533e738d1551130b2e3ceeb1d71a46064cd3d398a10f656c399 Apr 17 11:18:45.167151 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.167115 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4"] Apr 17 11:18:45.388290 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.388252 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" event={"ID":"50377741-7ad3-48c1-aea2-6c86f69d3a25","Type":"ContainerStarted","Data":"5d34ecf2fc1ecba8a4b37e90727da7511df11ce1a60b11258564d1c3dc37c555"} Apr 17 11:18:45.389503 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.389475 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-764bf577f6-vwbj5" event={"ID":"88f36c9d-2a1c-4fa7-b48b-0bcca001c665","Type":"ContainerStarted","Data":"d8b55dd16d52c7b49d81bf9657e24a500cf6951620f687dee7dbecc8082b3ad4"} Apr 17 11:18:45.389503 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.389508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-764bf577f6-vwbj5" event={"ID":"88f36c9d-2a1c-4fa7-b48b-0bcca001c665","Type":"ContainerStarted","Data":"4fbf0101414f9533e738d1551130b2e3ceeb1d71a46064cd3d398a10f656c399"} Apr 17 11:18:45.409127 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:45.409017 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-764bf577f6-vwbj5" podStartSLOduration=16.408994874 podStartE2EDuration="16.408994874s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:45.408322946 +0000 UTC m=+160.987453694" watchObservedRunningTime="2026-04-17 11:18:45.408994874 +0000 UTC m=+160.988125621" Apr 17 11:18:46.013706 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.013672 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:46.016631 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.016599 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:46.204208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.204171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:18:46.207317 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.207245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"image-registry-8b9d5b99-7ms26\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:18:46.305104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.305000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:18:46.305104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.305059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:18:46.308005 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.307969 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3d7992a-df2f-42c4-b112-16554731e7e3-metrics-tls\") pod \"dns-default-v9vzq\" (UID: \"d3d7992a-df2f-42c4-b112-16554731e7e3\") " pod="openshift-dns/dns-default-v9vzq" Apr 17 11:18:46.308144 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.308046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/623d8865-4f62-459b-929b-d12c6978284a-cert\") pod \"ingress-canary-jv8wn\" (UID: \"623d8865-4f62-459b-929b-d12c6978284a\") " pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:18:46.392993 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.392959 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:46.394524 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.394483 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-764bf577f6-vwbj5" Apr 17 11:18:46.481470 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.481431 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-whkkk\"" Apr 17 11:18:46.481470 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.481459 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-l4t4t\"" Apr 17 11:18:46.489343 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:18:46.489306 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:18:46.489538 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.489378 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jv8wn" Apr 17 11:18:46.721660 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.721629 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8b9d5b99-7ms26"] Apr 17 11:18:46.726212 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:46.726126 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde7b410_7e1b_49fd_9dc0_496cc0d7662c.slice/crio-097c37c47f698eda816057016fde247179570aeedd0e73c8b3aad783714b2734 WatchSource:0}: Error finding container 097c37c47f698eda816057016fde247179570aeedd0e73c8b3aad783714b2734: Status 404 returned error can't find the container with id 097c37c47f698eda816057016fde247179570aeedd0e73c8b3aad783714b2734 Apr 17 11:18:46.735271 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:46.735227 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jv8wn"] Apr 17 11:18:46.742097 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:46.742058 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623d8865_4f62_459b_929b_d12c6978284a.slice/crio-3823f57fa7b4b3df991b6253896539d89f105612641cf3bcb21661306285c8e2 WatchSource:0}: Error finding container 3823f57fa7b4b3df991b6253896539d89f105612641cf3bcb21661306285c8e2: Status 404 returned error can't find the container with id 3823f57fa7b4b3df991b6253896539d89f105612641cf3bcb21661306285c8e2 Apr 17 11:18:47.400322 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.400254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" event={"ID":"50377741-7ad3-48c1-aea2-6c86f69d3a25","Type":"ContainerStarted","Data":"9435aef9566337cd883a47440b53e049e8bc020ac88b588366cc1623933070c5"} Apr 17 11:18:47.400322 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.400298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" event={"ID":"50377741-7ad3-48c1-aea2-6c86f69d3a25","Type":"ContainerStarted","Data":"e1537d943d4b7a90f05efef3009eb9e9f6ee571266f99496c253f82d660212cc"} Apr 17 11:18:47.402013 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.401968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" event={"ID":"bde7b410-7e1b-49fd-9dc0-496cc0d7662c","Type":"ContainerStarted","Data":"70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40"} Apr 17 11:18:47.402013 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.402011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" event={"ID":"bde7b410-7e1b-49fd-9dc0-496cc0d7662c","Type":"ContainerStarted","Data":"097c37c47f698eda816057016fde247179570aeedd0e73c8b3aad783714b2734"} Apr 17 11:18:47.402219 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.402094 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:18:47.403347 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.403315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jv8wn" event={"ID":"623d8865-4f62-459b-929b-d12c6978284a","Type":"ContainerStarted","Data":"3823f57fa7b4b3df991b6253896539d89f105612641cf3bcb21661306285c8e2"} Apr 17 11:18:47.416531 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.416472 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xh8b4" podStartSLOduration=16.988752167 podStartE2EDuration="18.416455041s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:18:45.210179909 +0000 UTC m=+160.789310667" lastFinishedPulling="2026-04-17 11:18:46.637882802 +0000 UTC m=+162.217013541" observedRunningTime="2026-04-17 11:18:47.41539 +0000 UTC m=+162.994520748" watchObservedRunningTime="2026-04-17 11:18:47.416455041 +0000 UTC m=+162.995585789" Apr 17 11:18:47.436216 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:47.436156 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" podStartSLOduration=142.436138837 podStartE2EDuration="2m22.436138837s" podCreationTimestamp="2026-04-17 11:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:47.43561127 +0000 UTC m=+163.014742017" watchObservedRunningTime="2026-04-17 11:18:47.436138837 +0000 UTC m=+163.015269583" Apr 17 11:18:49.409186 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:49.409148 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jv8wn" event={"ID":"623d8865-4f62-459b-929b-d12c6978284a","Type":"ContainerStarted","Data":"87ef71ecd1ca77b62adc05675fda8dbc0ae687a735593a7139f4734cd07ede7b"} Apr 17 11:18:49.424143 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:49.424088 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jv8wn" podStartSLOduration=129.753435464 podStartE2EDuration="2m11.424071256s" podCreationTimestamp="2026-04-17 11:16:38 +0000 UTC" firstStartedPulling="2026-04-17 11:18:46.744419813 +0000 UTC m=+162.323550543" lastFinishedPulling="2026-04-17 11:18:48.415055606 +0000 UTC m=+163.994186335" observedRunningTime="2026-04-17 11:18:49.423477744 +0000 UTC m=+165.002608483" watchObservedRunningTime="2026-04-17 11:18:49.424071256 +0000 UTC m=+165.003202000" Apr 17 11:18:53.980419 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:53.980307 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:18:55.976973 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.976939 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-k6bbx"] Apr 17 11:18:55.982234 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.981837 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v9vzq" Apr 17 11:18:55.982422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.982312 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:55.984835 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.984799 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:18:55.985002 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.984799 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:18:55.985332 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.985301 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvwdc\"" Apr 17 11:18:55.985471 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.985410 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rkd8r\"" Apr 17 11:18:55.985585 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.985561 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 11:18:55.985683 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.985648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 11:18:55.992704 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:55.992680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v9vzq" Apr 17 11:18:56.000195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.000162 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-k6bbx"] Apr 17 11:18:56.084407 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.084342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-crio-socket\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.084633 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.084412 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.084633 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.084461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-data-volume\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.084633 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.084486 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn8xf\" (UniqueName: \"kubernetes.io/projected/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-kube-api-access-zn8xf\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " 
pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.084633 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.084525 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.128979 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.128908 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v9vzq"] Apr 17 11:18:56.133775 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:56.133742 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3d7992a_df2f_42c4_b112_16554731e7e3.slice/crio-7143718cfa7b81c31cc97bb1fb74c611c82c53cb616bbc90451ebf8929aac07b WatchSource:0}: Error finding container 7143718cfa7b81c31cc97bb1fb74c611c82c53cb616bbc90451ebf8929aac07b: Status 404 returned error can't find the container with id 7143718cfa7b81c31cc97bb1fb74c611c82c53cb616bbc90451ebf8929aac07b Apr 17 11:18:56.184950 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.184909 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-crio-socket\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.184950 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.184951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.185162 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.184983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-data-volume\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.185162 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.185009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn8xf\" (UniqueName: \"kubernetes.io/projected/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-kube-api-access-zn8xf\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.185162 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.185046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.185162 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.185046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-crio-socket\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.185420 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.185396 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-data-volume\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.185626 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.185607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.187473 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.187449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.196062 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.196032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn8xf\" (UniqueName: \"kubernetes.io/projected/bbded126-d122-4dc3-b0bd-fd03d3ac38e7-kube-api-access-zn8xf\") pod \"insights-runtime-extractor-k6bbx\" (UID: \"bbded126-d122-4dc3-b0bd-fd03d3ac38e7\") " pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.293676 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.293619 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-k6bbx" Apr 17 11:18:56.428223 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.428184 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-k6bbx"] Apr 17 11:18:56.428864 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:56.428834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v9vzq" event={"ID":"d3d7992a-df2f-42c4-b112-16554731e7e3","Type":"ContainerStarted","Data":"7143718cfa7b81c31cc97bb1fb74c611c82c53cb616bbc90451ebf8929aac07b"} Apr 17 11:18:56.432251 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:18:56.432218 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbded126_d122_4dc3_b0bd_fd03d3ac38e7.slice/crio-02ce2c2608f07ee7e8abfa6d63fea554d683b30d99654f1ba8f9bbdf368dedc0 WatchSource:0}: Error finding container 02ce2c2608f07ee7e8abfa6d63fea554d683b30d99654f1ba8f9bbdf368dedc0: Status 404 returned error can't find the container with id 02ce2c2608f07ee7e8abfa6d63fea554d683b30d99654f1ba8f9bbdf368dedc0 Apr 17 11:18:57.434494 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:57.434455 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k6bbx" event={"ID":"bbded126-d122-4dc3-b0bd-fd03d3ac38e7","Type":"ContainerStarted","Data":"67985bd2097989ca8d7d90cc201f843dcc0720288de4cb951c1f56bdde0c467d"} Apr 17 11:18:57.434980 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:57.434505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k6bbx" event={"ID":"bbded126-d122-4dc3-b0bd-fd03d3ac38e7","Type":"ContainerStarted","Data":"02ce2c2608f07ee7e8abfa6d63fea554d683b30d99654f1ba8f9bbdf368dedc0"} Apr 17 11:18:58.440330 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:58.440292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k6bbx" event={"ID":"bbded126-d122-4dc3-b0bd-fd03d3ac38e7","Type":"ContainerStarted","Data":"af09d5f682260dcee5b4a911162dbfac2abd3332fdb0a63096f90e942c85d958"} Apr 17 11:18:58.442241 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:58.442205 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v9vzq" event={"ID":"d3d7992a-df2f-42c4-b112-16554731e7e3","Type":"ContainerStarted","Data":"54337dba52125f31efa6340afc9b0c392aca9a3807f66f9b0d6d25b43b25e7fd"} Apr 17 11:18:58.442386 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:58.442249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v9vzq" event={"ID":"d3d7992a-df2f-42c4-b112-16554731e7e3","Type":"ContainerStarted","Data":"319d8ad9bc06f73d579d8c265cf0b7ddf08c37df547c604223c5a3b8faa95777"} Apr 17 11:18:58.442386 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:58.442364 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v9vzq" Apr 17 11:18:58.459392 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:58.459325 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v9vzq" podStartSLOduration=139.218584582 podStartE2EDuration="2m20.459305978s" podCreationTimestamp="2026-04-17 11:16:38 +0000 UTC" firstStartedPulling="2026-04-17 11:18:56.135662036 +0000 UTC m=+171.714792760" lastFinishedPulling="2026-04-17 11:18:57.376383427 +0000 UTC m=+172.955514156" 
observedRunningTime="2026-04-17 11:18:58.458494068 +0000 UTC m=+174.037624820" watchObservedRunningTime="2026-04-17 11:18:58.459305978 +0000 UTC m=+174.038436725" Apr 17 11:18:59.451308 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:59.451268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k6bbx" event={"ID":"bbded126-d122-4dc3-b0bd-fd03d3ac38e7","Type":"ContainerStarted","Data":"51b2881436c28e0054b90ec5738feb407c544b95958d9486977a7c889a079b56"} Apr 17 11:18:59.467793 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:18:59.467744 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-k6bbx" podStartSLOduration=2.292381045 podStartE2EDuration="4.467729442s" podCreationTimestamp="2026-04-17 11:18:55 +0000 UTC" firstStartedPulling="2026-04-17 11:18:56.502469983 +0000 UTC m=+172.081600719" lastFinishedPulling="2026-04-17 11:18:58.677818388 +0000 UTC m=+174.256949116" observedRunningTime="2026-04-17 11:18:59.466601623 +0000 UTC m=+175.045732371" watchObservedRunningTime="2026-04-17 11:18:59.467729442 +0000 UTC m=+175.046860228" Apr 17 11:19:00.509238 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.509198 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76d5769cb6-d4kv5"] Apr 17 11:19:00.511450 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.511425 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.513523 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.513485 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 11:19:00.513661 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.513557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 11:19:00.514030 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.514011 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 11:19:00.514178 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.514018 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 11:19:00.514178 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.514071 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bvqk4\"" Apr 17 11:19:00.514310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.514080 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 11:19:00.514310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.514081 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 11:19:00.514310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.514133 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 11:19:00.521159 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.521127 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76d5769cb6-d4kv5"] Apr 17 11:19:00.621689 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.621648 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-oauth-config\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.621689 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.621690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvpp\" (UniqueName: \"kubernetes.io/projected/caf44699-05e4-4cb3-b71a-59485125b6f6-kube-api-access-ggvpp\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.621923 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.621718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-service-ca\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.621923 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.621776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-serving-cert\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.621923 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.621820 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-console-config\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.621923 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.621899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-oauth-serving-cert\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723027 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.722990 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-service-ca\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723027 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-serving-cert\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-console-config\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-oauth-serving-cert\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-oauth-config\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvpp\" (UniqueName: \"kubernetes.io/projected/caf44699-05e4-4cb3-b71a-59485125b6f6-kube-api-access-ggvpp\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723932 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-oauth-serving-cert\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.723932 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-console-config\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.724178 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.723943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-service-ca\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.725697 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.725661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-serving-cert\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.725821 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.725659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-oauth-config\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.733679 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:19:00.733634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvpp\" (UniqueName: \"kubernetes.io/projected/caf44699-05e4-4cb3-b71a-59485125b6f6-kube-api-access-ggvpp\") pod \"console-76d5769cb6-d4kv5\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.822708 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.822608 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:00.925070 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.925031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:19:00.927557 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.927525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3746d925-d855-4fc1-b6a3-cb7afc7b2395-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zgdf8\" (UID: \"3746d925-d855-4fc1-b6a3-cb7afc7b2395\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:19:00.948865 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:00.948832 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76d5769cb6-d4kv5"] Apr 17 11:19:00.953191 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:00.953160 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf44699_05e4_4cb3_b71a_59485125b6f6.slice/crio-22731e4c43e1f3141d230a05ad19305f92fa433e4f07fa72e26ec09f869b47f0 WatchSource:0}: Error finding container 22731e4c43e1f3141d230a05ad19305f92fa433e4f07fa72e26ec09f869b47f0: Status 404 returned error can't find the container with id 22731e4c43e1f3141d230a05ad19305f92fa433e4f07fa72e26ec09f869b47f0 Apr 17 11:19:01.206197 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:01.206100 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" Apr 17 11:19:01.332786 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:01.332748 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8"] Apr 17 11:19:01.336216 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:01.336177 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3746d925_d855_4fc1_b6a3_cb7afc7b2395.slice/crio-3d3b43b1e7d1411aae9de816b202db1af446d39ff3e8fd19f8f0a0a9892be798 WatchSource:0}: Error finding container 3d3b43b1e7d1411aae9de816b202db1af446d39ff3e8fd19f8f0a0a9892be798: Status 404 returned error can't find the container with id 3d3b43b1e7d1411aae9de816b202db1af446d39ff3e8fd19f8f0a0a9892be798 Apr 17 11:19:01.458065 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:01.457974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5769cb6-d4kv5" event={"ID":"caf44699-05e4-4cb3-b71a-59485125b6f6","Type":"ContainerStarted","Data":"22731e4c43e1f3141d230a05ad19305f92fa433e4f07fa72e26ec09f869b47f0"} Apr 17 11:19:01.459038 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:01.459011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" event={"ID":"3746d925-d855-4fc1-b6a3-cb7afc7b2395","Type":"ContainerStarted","Data":"3d3b43b1e7d1411aae9de816b202db1af446d39ff3e8fd19f8f0a0a9892be798"} Apr 17 11:19:04.471553 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:04.471515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5769cb6-d4kv5" event={"ID":"caf44699-05e4-4cb3-b71a-59485125b6f6","Type":"ContainerStarted","Data":"e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e"} Apr 17 11:19:04.472961 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:04.472930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" event={"ID":"3746d925-d855-4fc1-b6a3-cb7afc7b2395","Type":"ContainerStarted","Data":"012f307149eb9d2242c9c9b069ae257f36ee47f414b65aa63e22d57007b285cf"} Apr 17 11:19:04.499901 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:04.499843 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76d5769cb6-d4kv5" podStartSLOduration=1.481705176 podStartE2EDuration="4.499826832s" podCreationTimestamp="2026-04-17 11:19:00 +0000 UTC" firstStartedPulling="2026-04-17 11:19:00.955644242 +0000 UTC m=+176.534774967" lastFinishedPulling="2026-04-17 11:19:03.973765895 +0000 UTC m=+179.552896623" observedRunningTime="2026-04-17 11:19:04.498178219 +0000 UTC m=+180.077309011" watchObservedRunningTime="2026-04-17 11:19:04.499826832 +0000 UTC m=+180.078957580" Apr 17 11:19:04.524025 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:04.523969 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zgdf8" podStartSLOduration=32.923577126 podStartE2EDuration="35.52395443s" podCreationTimestamp="2026-04-17 11:18:29 +0000 UTC" firstStartedPulling="2026-04-17 11:19:01.338105321 +0000 UTC m=+176.917236047" lastFinishedPulling="2026-04-17 11:19:03.93848261 +0000 UTC m=+179.517613351" observedRunningTime="2026-04-17 11:19:04.523781576 +0000 UTC m=+180.102912336" watchObservedRunningTime="2026-04-17 11:19:04.52395443 +0000 
UTC m=+180.103085175" Apr 17 11:19:06.493579 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:06.493535 2579 patch_prober.go:28] interesting pod/image-registry-8b9d5b99-7ms26 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 11:19:06.493944 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:06.493610 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 11:19:08.410386 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.410332 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:19:08.453946 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.453911 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v9vzq" Apr 17 11:19:08.551700 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.551662 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfzww"] Apr 17 11:19:08.554431 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.554409 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.556505 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.556478 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 11:19:08.556666 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.556531 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 11:19:08.556666 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.556566 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:19:08.556666 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.556584 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-8dbzh\"" Apr 17 11:19:08.563434 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.563405 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfzww"] Apr 17 11:19:08.700179 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.700082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc43c19d-0f18-4792-8aac-caf4ce068053-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.700179 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.700170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc43c19d-0f18-4792-8aac-caf4ce068053-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: 
\"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.700393 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.700198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvtsq\" (UniqueName: \"kubernetes.io/projected/dc43c19d-0f18-4792-8aac-caf4ce068053-kube-api-access-xvtsq\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.700393 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.700257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc43c19d-0f18-4792-8aac-caf4ce068053-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.801341 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.801304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc43c19d-0f18-4792-8aac-caf4ce068053-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.801521 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.801381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc43c19d-0f18-4792-8aac-caf4ce068053-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.801521 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.801439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc43c19d-0f18-4792-8aac-caf4ce068053-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.801521 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.801477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvtsq\" (UniqueName: \"kubernetes.io/projected/dc43c19d-0f18-4792-8aac-caf4ce068053-kube-api-access-xvtsq\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.802704 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.802668 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc43c19d-0f18-4792-8aac-caf4ce068053-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.803907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.803864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/dc43c19d-0f18-4792-8aac-caf4ce068053-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.804538 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.804521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc43c19d-0f18-4792-8aac-caf4ce068053-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.808927 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.808894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvtsq\" (UniqueName: \"kubernetes.io/projected/dc43c19d-0f18-4792-8aac-caf4ce068053-kube-api-access-xvtsq\") pod \"prometheus-operator-5676c8c784-rfzww\" (UID: \"dc43c19d-0f18-4792-8aac-caf4ce068053\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:08.864211 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:08.864169 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" Apr 17 11:19:09.005628 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.005596 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfzww"] Apr 17 11:19:09.009097 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:09.009050 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc43c19d_0f18_4792_8aac_caf4ce068053.slice/crio-c292d13cc4127a267becd556e986d59e1f6f60b69d22f45a3f513422b948dfe3 WatchSource:0}: Error finding container c292d13cc4127a267becd556e986d59e1f6f60b69d22f45a3f513422b948dfe3: Status 404 returned error can't find the container with id c292d13cc4127a267becd556e986d59e1f6f60b69d22f45a3f513422b948dfe3 Apr 17 11:19:09.487243 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.487203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" event={"ID":"dc43c19d-0f18-4792-8aac-caf4ce068053","Type":"ContainerStarted","Data":"c292d13cc4127a267becd556e986d59e1f6f60b69d22f45a3f513422b948dfe3"} Apr 17 11:19:09.498873 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.498840 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74d7467f6c-8bz7t"] Apr 17 11:19:09.501299 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.501266 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.508239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.508211 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 11:19:09.511098 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.511069 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d7467f6c-8bz7t"] Apr 17 11:19:09.608708 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.608667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-serving-cert\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.608891 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.608738 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xrz\" (UniqueName: \"kubernetes.io/projected/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-kube-api-access-52xrz\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.608891 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.608795 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-config\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.609005 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.608884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-oauth-serving-cert\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.609005 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.608925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-oauth-config\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.609005 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.608948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-service-ca\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.609152 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.609033 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-trusted-ca-bundle\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710232 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:19:09.710187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-trusted-ca-bundle\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710464 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.710378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-serving-cert\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710464 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.710427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52xrz\" (UniqueName: \"kubernetes.io/projected/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-kube-api-access-52xrz\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710601 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.710477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-config\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710601 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.710548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-oauth-serving-cert\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710601 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.710597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-oauth-config\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.710755 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.710620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-service-ca\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.711325 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.711282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-trusted-ca-bundle\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.711458 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.711291 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-oauth-serving-cert\") pod 
\"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.711458 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.711291 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-service-ca\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.711458 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.711438 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-config\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.713014 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.712987 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-serving-cert\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.713181 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.713161 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-oauth-config\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.718926 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.718901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xrz\" (UniqueName: \"kubernetes.io/projected/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-kube-api-access-52xrz\") pod \"console-74d7467f6c-8bz7t\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:09.815437 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:09.815401 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:10.127444 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.127410 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d7467f6c-8bz7t"] Apr 17 11:19:10.130792 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:10.130760 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2d7379_c701_4b71_9b51_3f74dd1d8b90.slice/crio-636d2f1aa53fab8c6a0d1f3f45ef48238a63d4a63401311ad5cf9cd967e9c2cd WatchSource:0}: Error finding container 636d2f1aa53fab8c6a0d1f3f45ef48238a63d4a63401311ad5cf9cd967e9c2cd: Status 404 returned error can't find the container with id 636d2f1aa53fab8c6a0d1f3f45ef48238a63d4a63401311ad5cf9cd967e9c2cd Apr 17 11:19:10.491216 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.491118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" event={"ID":"dc43c19d-0f18-4792-8aac-caf4ce068053","Type":"ContainerStarted","Data":"b386a9684329bc816b0e225173c137e4ccd980937c8d361d7d461134918b2c59"} Apr 17 11:19:10.491216 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.491163 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" event={"ID":"dc43c19d-0f18-4792-8aac-caf4ce068053","Type":"ContainerStarted","Data":"9e8cad6034ba61dbc916f123bccb65dcbbf2b1b0da7c8996ed07c2d0a292fe95"} Apr 17 11:19:10.492570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.492547 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d7467f6c-8bz7t" event={"ID":"1c2d7379-c701-4b71-9b51-3f74dd1d8b90","Type":"ContainerStarted","Data":"69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622"} Apr 17 11:19:10.492672 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.492574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d7467f6c-8bz7t" event={"ID":"1c2d7379-c701-4b71-9b51-3f74dd1d8b90","Type":"ContainerStarted","Data":"636d2f1aa53fab8c6a0d1f3f45ef48238a63d4a63401311ad5cf9cd967e9c2cd"} Apr 17 11:19:10.511922 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.511871 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfzww" podStartSLOduration=1.4982922379999999 podStartE2EDuration="2.511856214s" podCreationTimestamp="2026-04-17 11:19:08 +0000 UTC" firstStartedPulling="2026-04-17 11:19:09.011133628 +0000 UTC m=+184.590264353" lastFinishedPulling="2026-04-17 11:19:10.0246976 +0000 UTC m=+185.603828329" observedRunningTime="2026-04-17 11:19:10.510763254 +0000 UTC m=+186.089894001" watchObservedRunningTime="2026-04-17 11:19:10.511856214 +0000 UTC m=+186.090986962" Apr 17 11:19:10.535898 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.535850 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74d7467f6c-8bz7t" podStartSLOduration=1.535833418 podStartE2EDuration="1.535833418s" podCreationTimestamp="2026-04-17 11:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:10.534947684 +0000 UTC m=+186.114078432" watchObservedRunningTime="2026-04-17 11:19:10.535833418 +0000 UTC m=+186.114964161" Apr 17 11:19:10.822939 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.822898 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:10.823263 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.823236 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:10.828038 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:10.828010 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:11.499170 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.499139 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:11.925334 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.925301 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-trf74"] Apr 17 11:19:11.927824 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.927804 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:11.929992 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.929965 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 11:19:11.930555 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.930528 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 11:19:11.930789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.930775 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dblbs\"" Apr 17 11:19:11.931042 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:11.931020 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 11:19:12.030661 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030622 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.030661 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-root\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.030892 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030747 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-sys\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.030892 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/656c1bde-c18e-4071-a30b-e489b68a76fd-metrics-client-ca\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.030892 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030830 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-textfile\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.030892 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-wtmp\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.031079 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-accelerators-collector-config\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.031079 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prmch\" (UniqueName: \"kubernetes.io/projected/656c1bde-c18e-4071-a30b-e489b68a76fd-kube-api-access-prmch\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.031079 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.030980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-tls\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132332 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-accelerators-collector-config\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132332 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132334 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prmch\" (UniqueName: \"kubernetes.io/projected/656c1bde-c18e-4071-a30b-e489b68a76fd-kube-api-access-prmch\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132571 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-tls\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132571 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132571 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:19:12.132478 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:19:12.132571 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132495 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-root\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132571 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:19:12.132542 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-tls podName:656c1bde-c18e-4071-a30b-e489b68a76fd nodeName:}" failed. No retries permitted until 2026-04-17 11:19:12.632521135 +0000 UTC m=+188.211651859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-tls") pod "node-exporter-trf74" (UID: "656c1bde-c18e-4071-a30b-e489b68a76fd") : secret "node-exporter-tls" not found Apr 17 11:19:12.132571 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132553 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-root\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-sys\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/656c1bde-c18e-4071-a30b-e489b68a76fd-metrics-client-ca\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-textfile\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132697 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-sys\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132702 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-wtmp\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.132877 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132821 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-wtmp\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.133086 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.132979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-accelerators-collector-config\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.133086 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.133004 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-textfile\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.133174 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.133158 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/656c1bde-c18e-4071-a30b-e489b68a76fd-metrics-client-ca\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.134964 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.134935 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.151080 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.151056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prmch\" (UniqueName: \"kubernetes.io/projected/656c1bde-c18e-4071-a30b-e489b68a76fd-kube-api-access-prmch\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.636630 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.636590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-tls\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") 
" pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.639169 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.639136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/656c1bde-c18e-4071-a30b-e489b68a76fd-node-exporter-tls\") pod \"node-exporter-trf74\" (UID: \"656c1bde-c18e-4071-a30b-e489b68a76fd\") " pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.837678 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:12.837641 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-trf74" Apr 17 11:19:12.848088 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:12.848055 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod656c1bde_c18e_4071_a30b_e489b68a76fd.slice/crio-2359d77fd47d1bf0196535e2e50ec550945e16f7c92421517e59b0b241390cc4 WatchSource:0}: Error finding container 2359d77fd47d1bf0196535e2e50ec550945e16f7c92421517e59b0b241390cc4: Status 404 returned error can't find the container with id 2359d77fd47d1bf0196535e2e50ec550945e16f7c92421517e59b0b241390cc4 Apr 17 11:19:13.502829 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:13.502771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trf74" event={"ID":"656c1bde-c18e-4071-a30b-e489b68a76fd","Type":"ContainerStarted","Data":"2359d77fd47d1bf0196535e2e50ec550945e16f7c92421517e59b0b241390cc4"} Apr 17 11:19:14.506267 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:14.506225 2579 generic.go:358] "Generic (PLEG): container finished" podID="656c1bde-c18e-4071-a30b-e489b68a76fd" containerID="3d724f6e0d075b42f0f3789768dd0fb3a4c15a4800f1914dbcc5b0d436c2fc2a" exitCode=0 Apr 17 11:19:14.506797 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:14.506322 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trf74" event={"ID":"656c1bde-c18e-4071-a30b-e489b68a76fd","Type":"ContainerDied","Data":"3d724f6e0d075b42f0f3789768dd0fb3a4c15a4800f1914dbcc5b0d436c2fc2a"} Apr 17 11:19:15.511153 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:15.511117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trf74" event={"ID":"656c1bde-c18e-4071-a30b-e489b68a76fd","Type":"ContainerStarted","Data":"94614f49454a64cd30c85b4186670d3dbc32795fcd2fa64dc75d02d91ecc444f"} Apr 17 11:19:15.511153 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:15.511161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trf74" event={"ID":"656c1bde-c18e-4071-a30b-e489b68a76fd","Type":"ContainerStarted","Data":"3fbeff344c3e6e6f03d1215314b673ea5c2a3f49119a693b8add3330f38f310b"} Apr 17 11:19:15.533291 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:15.533242 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-trf74" podStartSLOduration=3.763880076 podStartE2EDuration="4.533227876s" podCreationTimestamp="2026-04-17 11:19:11 +0000 UTC" firstStartedPulling="2026-04-17 11:19:12.849899368 +0000 UTC m=+188.429030094" lastFinishedPulling="2026-04-17 11:19:13.619247152 +0000 UTC m=+189.198377894" observedRunningTime="2026-04-17 11:19:15.532681338 +0000 UTC m=+191.111812113" watchObservedRunningTime="2026-04-17 11:19:15.533227876 +0000 UTC m=+191.112358623" Apr 17 11:19:18.242176 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.242132 2579 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:19:18.246931 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.246900 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.249777 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.249729 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mdjld\"" Apr 17 11:19:18.249948 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.249930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 11:19:18.250049 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250016 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 11:19:18.250114 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250041 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e81vcoe8oan11\"" Apr 17 11:19:18.250200 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250184 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 11:19:18.250298 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250220 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 11:19:18.250535 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250515 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 11:19:18.250535 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250527 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 11:19:18.250717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250562 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 11:19:18.250717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250606 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 11:19:18.250717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250610 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 11:19:18.250717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250518 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 11:19:18.250952 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.250936 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 11:19:18.251190 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.251176 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 11:19:18.252996 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.252970 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 11:19:18.263591 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.263548 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:19:18.289365 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289365 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289343 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289563 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289383 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289563 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nwdl\" (UniqueName: \"kubernetes.io/projected/09ccee58-b7db-440e-abd6-ab0c4ac708e0-kube-api-access-9nwdl\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289563 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09ccee58-b7db-440e-abd6-ab0c4ac708e0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289667 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-web-config\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289700 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289733 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289769 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289799 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289766 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289799 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289856 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289813 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289893 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289927 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289960 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09ccee58-b7db-440e-abd6-ab0c4ac708e0-config-out\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.289992 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.289975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.290042 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.290026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-config\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.290075 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.290050 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.390908 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.390874 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.390915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09ccee58-b7db-440e-abd6-ab0c4ac708e0-config-out\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.390937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-config\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391104 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391378 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391378 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391210 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391378 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391378 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nwdl\" (UniqueName: \"kubernetes.io/projected/09ccee58-b7db-440e-abd6-ab0c4ac708e0-kube-api-access-9nwdl\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391378 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391296 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391378 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391302 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09ccee58-b7db-440e-abd6-ab0c4ac708e0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-web-config\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391501 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.391686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.392052 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.391699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.392566 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.392534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.392740 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.392711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.392996 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.392968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.395137 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.395110 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/09ccee58-b7db-440e-abd6-ab0c4ac708e0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.395392 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.395370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-config\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.395636 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.395613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.395712 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.395622 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.395824 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.395794 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.395941 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.395917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/09ccee58-b7db-440e-abd6-ab0c4ac708e0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.396276 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.396230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/09ccee58-b7db-440e-abd6-ab0c4ac708e0-config-out\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.396513 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.396490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.396755 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.396732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-web-config\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.396828 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:19:18.396803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.397337 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.397313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.397651 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.397634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/09ccee58-b7db-440e-abd6-ab0c4ac708e0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.401547 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.401524 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nwdl\" (UniqueName: \"kubernetes.io/projected/09ccee58-b7db-440e-abd6-ab0c4ac708e0-kube-api-access-9nwdl\") pod \"prometheus-k8s-0\" (UID: \"09ccee58-b7db-440e-abd6-ab0c4ac708e0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.558116 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.558078 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:18.714838 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:18.714799 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:19:18.718624 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:18.718581 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ccee58_b7db_440e_abd6_ab0c4ac708e0.slice/crio-8b148422493a795e133aa9d6efea29f0e99a28d3559f201c9a8de85e9545ca37 WatchSource:0}: Error finding container 8b148422493a795e133aa9d6efea29f0e99a28d3559f201c9a8de85e9545ca37: Status 404 returned error can't find the container with id 8b148422493a795e133aa9d6efea29f0e99a28d3559f201c9a8de85e9545ca37 Apr 17 11:19:19.347964 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:19.347922 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8b9d5b99-7ms26"] Apr 17 11:19:19.525231 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:19.525189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"8b148422493a795e133aa9d6efea29f0e99a28d3559f201c9a8de85e9545ca37"} Apr 17 11:19:19.816217 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:19.816177 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:19.816428 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:19.816235 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:19.821388 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:19:19.821361 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:20.529978 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:20.529947 2579 generic.go:358] "Generic (PLEG): container finished" podID="09ccee58-b7db-440e-abd6-ab0c4ac708e0" containerID="d9602559cbb8110d33298cdd581d5e8989b4c85f350d9f57e2432bcf52ea23c7" exitCode=0 Apr 17 11:19:20.530410 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:20.530037 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerDied","Data":"d9602559cbb8110d33298cdd581d5e8989b4c85f350d9f57e2432bcf52ea23c7"} Apr 17 11:19:20.534229 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:20.534210 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:20.638032 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:20.637993 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76d5769cb6-d4kv5"] Apr 17 11:19:21.318950 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.318919 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5457b59d48-qx8rd"] Apr 17 11:19:21.322368 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.322328 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.335292 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.335259 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5457b59d48-qx8rd"] Apr 17 11:19:21.417386 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-console-config\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.417535 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-trusted-ca-bundle\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.417535 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417470 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-serving-cert\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.417535 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-service-ca\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.417653 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417578 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-oauth-config\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.417705 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrb5\" (UniqueName: \"kubernetes.io/projected/f0d4464c-c569-48dd-a58f-7a416274512b-kube-api-access-tgrb5\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.417705 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.417691 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-oauth-serving-cert\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519097 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-oauth-config\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519287 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrb5\" (UniqueName: \"kubernetes.io/projected/f0d4464c-c569-48dd-a58f-7a416274512b-kube-api-access-tgrb5\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519287 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-oauth-serving-cert\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519287 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-console-config\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519287 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-trusted-ca-bundle\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519287 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-serving-cert\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.519570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.519319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-service-ca\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.520192 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.520142 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-console-config\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.520318 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.520249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-oauth-serving-cert\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.520461 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.520419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-service-ca\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.520775 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.520747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-trusted-ca-bundle\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.521970 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.521940 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-oauth-config\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.522290 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.522264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-serving-cert\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.532273 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.532237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrb5\" (UniqueName: \"kubernetes.io/projected/f0d4464c-c569-48dd-a58f-7a416274512b-kube-api-access-tgrb5\") pod \"console-5457b59d48-qx8rd\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.634655 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.634562 2579 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:21.785616 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:21.785562 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5457b59d48-qx8rd"] Apr 17 11:19:21.789925 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:19:21.789885 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d4464c_c569_48dd_a58f_7a416274512b.slice/crio-56fc955afc23421d8142bd32689853a2a9f70b1e61ec11ce397df0d16692c6db WatchSource:0}: Error finding container 56fc955afc23421d8142bd32689853a2a9f70b1e61ec11ce397df0d16692c6db: Status 404 returned error can't find the container with id 56fc955afc23421d8142bd32689853a2a9f70b1e61ec11ce397df0d16692c6db Apr 17 11:19:22.538238 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:22.538200 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5457b59d48-qx8rd" event={"ID":"f0d4464c-c569-48dd-a58f-7a416274512b","Type":"ContainerStarted","Data":"b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10"} Apr 17 11:19:22.538692 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:22.538247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5457b59d48-qx8rd" event={"ID":"f0d4464c-c569-48dd-a58f-7a416274512b","Type":"ContainerStarted","Data":"56fc955afc23421d8142bd32689853a2a9f70b1e61ec11ce397df0d16692c6db"} Apr 17 11:19:22.557311 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:22.557245 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5457b59d48-qx8rd" podStartSLOduration=1.5572240210000001 podStartE2EDuration="1.557224021s" podCreationTimestamp="2026-04-17 11:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:22.556039263 +0000 UTC m=+198.135170013" watchObservedRunningTime="2026-04-17 11:19:22.557224021 +0000 UTC m=+198.136354768" Apr 17 11:19:23.550273 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:23.550214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"97285fe9a75c727d1b265f29e0e8090c9dbda7590b2bf50164f67e552d6084c4"} Apr 17 11:19:24.554925 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:24.554886 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"ca85d63a28f07a049288d9d4ef98022906e78f801b90a9b30cba53b1be69bd2a"} Apr 17 11:19:25.561288 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:25.561237 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"663a59c9c20ca4a6c96ba180b4d381cb5248378c2a2850d6c518943b108f6818"} Apr 17 11:19:25.561288 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:25.561279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"438374138ef5f32d0412b0e75e96c2016f137664a8ffc18bcdfba7052bd276b1"} Apr 17 11:19:25.561717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:25.561294 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"963289f14ca4c7320e806b4f6fcf6e88968b5059921f864b2bbccca04d113f08"} Apr 17 11:19:26.569664 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:26.569625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"09ccee58-b7db-440e-abd6-ab0c4ac708e0","Type":"ContainerStarted","Data":"be35cbcceb53fceaac1051f904c363813cf1b1033aa3e2d8b995c919df0fcad1"} Apr 17 11:19:26.599171 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:26.599110 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.9893902350000001 podStartE2EDuration="8.599087337s" podCreationTimestamp="2026-04-17 11:19:18 +0000 UTC" firstStartedPulling="2026-04-17 11:19:18.720815351 +0000 UTC m=+194.299946076" lastFinishedPulling="2026-04-17 11:19:25.330512444 +0000 UTC m=+200.909643178" observedRunningTime="2026-04-17 11:19:26.59749955 +0000 UTC m=+202.176630295" watchObservedRunningTime="2026-04-17 11:19:26.599087337 +0000 UTC m=+202.178218076" Apr 17 11:19:28.558603 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:28.558567 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:19:31.634811 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:31.634768 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:31.635232 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:31.634822 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:31.639774 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:31.639748 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:32.590740 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:32.590710 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:19:32.651692 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:32.651656 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d7467f6c-8bz7t"] Apr 17 11:19:44.371878 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.371826 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" containerName="registry" containerID="cri-o://70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40" gracePeriod=30 Apr 17 11:19:44.620154 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.620124 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:19:44.622016 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.621926 2579 generic.go:358] "Generic (PLEG): container finished" podID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" containerID="70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40" exitCode=0 Apr 17 11:19:44.622016 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.621992 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" Apr 17 11:19:44.622280 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.622017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" event={"ID":"bde7b410-7e1b-49fd-9dc0-496cc0d7662c","Type":"ContainerDied","Data":"70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40"} Apr 17 11:19:44.622280 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.622059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8b9d5b99-7ms26" event={"ID":"bde7b410-7e1b-49fd-9dc0-496cc0d7662c","Type":"ContainerDied","Data":"097c37c47f698eda816057016fde247179570aeedd0e73c8b3aad783714b2734"} Apr 17 11:19:44.622280 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.622075 2579 scope.go:117] "RemoveContainer" containerID="70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40" Apr 17 11:19:44.634717 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.634684 2579 scope.go:117] "RemoveContainer" containerID="70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40" Apr 17 11:19:44.635111 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:19:44.635084 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40\": container with ID starting with 70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40 not found: ID does not exist" containerID="70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40" Apr 17 11:19:44.635168 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.635122 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40"} err="failed to get container status \"70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40\": rpc error: code = NotFound desc = could not find container \"70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40\": container with ID starting with 70d6fddec48d3d18a4ecfe538a6bb8916b0ef0ddc1d7bde979cb271e4078ce40 not found: ID does not exist" Apr 17 11:19:44.744813 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.744769 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-certificates\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.744847 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-bound-sa-token\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.744876 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.744927 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-ca-trust-extracted\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745200 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745076 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dhlr\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-kube-api-access-9dhlr\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745200 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745129 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-image-registry-private-configuration\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745200 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745191 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-trusted-ca\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745348 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745245 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-installation-pull-secrets\") pod \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\" (UID: \"bde7b410-7e1b-49fd-9dc0-496cc0d7662c\") " Apr 17 11:19:44.745463 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745427 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:44.745770 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745746 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-certificates\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.745930 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.745769 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:44.747744 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.747692 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-kube-api-access-9dhlr" (OuterVolumeSpecName: "kube-api-access-9dhlr") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "kube-api-access-9dhlr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:44.747896 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.747700 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:44.748067 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.748035 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:44.748190 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.748136 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:44.748317 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.748295 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:44.755225 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.755186 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bde7b410-7e1b-49fd-9dc0-496cc0d7662c" (UID: "bde7b410-7e1b-49fd-9dc0-496cc0d7662c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:19:44.846608 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846569 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-bound-sa-token\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.846608 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846604 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-registry-tls\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.846608 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846614 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-ca-trust-extracted\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.846821 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846624 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9dhlr\" (UniqueName: \"kubernetes.io/projected/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-kube-api-access-9dhlr\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.846821 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846634 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-image-registry-private-configuration\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.846821 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846643 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-trusted-ca\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.846821 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.846652 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde7b410-7e1b-49fd-9dc0-496cc0d7662c-installation-pull-secrets\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:44.943538 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.943492 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8b9d5b99-7ms26"] Apr 17 11:19:44.947054 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.947007 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8b9d5b99-7ms26"] Apr 17 11:19:44.984475 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:44.984439 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" path="/var/lib/kubelet/pods/bde7b410-7e1b-49fd-9dc0-496cc0d7662c/volumes" Apr 17 11:19:45.658672 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:45.658629 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76d5769cb6-d4kv5" podUID="caf44699-05e4-4cb3-b71a-59485125b6f6" containerName="console" containerID="cri-o://e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e" gracePeriod=15 Apr 17 11:19:45.904766 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:45.904739 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-76d5769cb6-d4kv5_caf44699-05e4-4cb3-b71a-59485125b6f6/console/0.log" Apr 17 11:19:45.904937 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:45.904803 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:46.057226 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057188 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-console-config\") pod \"caf44699-05e4-4cb3-b71a-59485125b6f6\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " Apr 17 11:19:46.057437 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057250 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-oauth-serving-cert\") pod \"caf44699-05e4-4cb3-b71a-59485125b6f6\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " Apr 17 11:19:46.057437 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057279 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggvpp\" (UniqueName: \"kubernetes.io/projected/caf44699-05e4-4cb3-b71a-59485125b6f6-kube-api-access-ggvpp\") pod \"caf44699-05e4-4cb3-b71a-59485125b6f6\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " Apr 17 11:19:46.057437 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057302 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-oauth-config\") pod \"caf44699-05e4-4cb3-b71a-59485125b6f6\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " Apr 17 11:19:46.057437 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057323 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-serving-cert\") pod \"caf44699-05e4-4cb3-b71a-59485125b6f6\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " Apr 17 11:19:46.057651 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057437 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-service-ca\") pod \"caf44699-05e4-4cb3-b71a-59485125b6f6\" (UID: \"caf44699-05e4-4cb3-b71a-59485125b6f6\") " Apr 17 11:19:46.057728 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057697 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-console-config" (OuterVolumeSpecName: "console-config") pod "caf44699-05e4-4cb3-b71a-59485125b6f6" (UID: "caf44699-05e4-4cb3-b71a-59485125b6f6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:46.057852 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057813 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "caf44699-05e4-4cb3-b71a-59485125b6f6" (UID: "caf44699-05e4-4cb3-b71a-59485125b6f6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:46.057985 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.057866 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-service-ca" (OuterVolumeSpecName: "service-ca") pod "caf44699-05e4-4cb3-b71a-59485125b6f6" (UID: "caf44699-05e4-4cb3-b71a-59485125b6f6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:46.059914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.059880 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "caf44699-05e4-4cb3-b71a-59485125b6f6" (UID: "caf44699-05e4-4cb3-b71a-59485125b6f6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:46.059914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.059896 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "caf44699-05e4-4cb3-b71a-59485125b6f6" (UID: "caf44699-05e4-4cb3-b71a-59485125b6f6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:46.060083 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.059905 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf44699-05e4-4cb3-b71a-59485125b6f6-kube-api-access-ggvpp" (OuterVolumeSpecName: "kube-api-access-ggvpp") pod "caf44699-05e4-4cb3-b71a-59485125b6f6" (UID: "caf44699-05e4-4cb3-b71a-59485125b6f6"). InnerVolumeSpecName "kube-api-access-ggvpp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:46.158907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.158866 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-console-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:46.158907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.158900 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-oauth-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:46.158907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.158910 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggvpp\" (UniqueName: \"kubernetes.io/projected/caf44699-05e4-4cb3-b71a-59485125b6f6-kube-api-access-ggvpp\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:46.158907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.158920 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-oauth-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:46.159181 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.158931 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caf44699-05e4-4cb3-b71a-59485125b6f6-console-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:46.159181 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.158940 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caf44699-05e4-4cb3-b71a-59485125b6f6-service-ca\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:46.630217 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.630188 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76d5769cb6-d4kv5_caf44699-05e4-4cb3-b71a-59485125b6f6/console/0.log" Apr 17 11:19:46.630422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.630233 2579 generic.go:358] "Generic (PLEG): container finished" podID="caf44699-05e4-4cb3-b71a-59485125b6f6" containerID="e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e" exitCode=2 Apr 17 11:19:46.630422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.630320 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76d5769cb6-d4kv5" Apr 17 11:19:46.630422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.630332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5769cb6-d4kv5" event={"ID":"caf44699-05e4-4cb3-b71a-59485125b6f6","Type":"ContainerDied","Data":"e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e"} Apr 17 11:19:46.630422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.630403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5769cb6-d4kv5" event={"ID":"caf44699-05e4-4cb3-b71a-59485125b6f6","Type":"ContainerDied","Data":"22731e4c43e1f3141d230a05ad19305f92fa433e4f07fa72e26ec09f869b47f0"} Apr 17 11:19:46.630422 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.630419 2579 scope.go:117] "RemoveContainer" containerID="e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e" Apr 17 11:19:46.639320 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.639300 2579 scope.go:117] "RemoveContainer" containerID="e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e" Apr 17 11:19:46.639699 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:19:46.639675 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e\": container with ID starting with e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e not found: ID does not exist" containerID="e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e" Apr 17 11:19:46.639819 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.639706 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e"} err="failed to get container status \"e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e\": rpc error: code = NotFound desc = could not find container \"e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e\": container with ID starting with e09a1b7cf8af7dfa869e3f5a1312e1ba101bf48017dd9a69e4307855c9887d8e not found: ID does not exist" Apr 17 11:19:46.658813 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.658772 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76d5769cb6-d4kv5"] Apr 17 11:19:46.663231 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.663192 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76d5769cb6-d4kv5"] Apr 17 11:19:46.983941 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:46.983856 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf44699-05e4-4cb3-b71a-59485125b6f6" path="/var/lib/kubelet/pods/caf44699-05e4-4cb3-b71a-59485125b6f6/volumes" Apr 17 11:19:52.655547 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:52.655507 2579 generic.go:358] "Generic (PLEG): container finished" podID="1b1e2d47-0361-4bc8-bc22-76c310dda1e0" containerID="cf6b294eb66588735274f99e6532874c8511adab874a3e728e9be2025e60363b" exitCode=0 Apr 17 11:19:52.655943 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:52.655585 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" event={"ID":"1b1e2d47-0361-4bc8-bc22-76c310dda1e0","Type":"ContainerDied","Data":"cf6b294eb66588735274f99e6532874c8511adab874a3e728e9be2025e60363b"} Apr 17 11:19:52.655943 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:19:52.655930 2579 scope.go:117] "RemoveContainer" containerID="cf6b294eb66588735274f99e6532874c8511adab874a3e728e9be2025e60363b" Apr 17 11:19:53.660016 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:53.659977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ws96b" event={"ID":"1b1e2d47-0361-4bc8-bc22-76c310dda1e0","Type":"ContainerStarted","Data":"2bf0b31a883840e1b288d512ef316b24b3f6628de4160021b0f10da06735d649"} Apr 17 11:19:57.672182 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:57.672131 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74d7467f6c-8bz7t" podUID="1c2d7379-c701-4b71-9b51-3f74dd1d8b90" containerName="console" containerID="cri-o://69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622" gracePeriod=15 Apr 17 11:19:57.955376 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:57.955327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d7467f6c-8bz7t_1c2d7379-c701-4b71-9b51-3f74dd1d8b90/console/0.log" Apr 17 11:19:57.955538 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:57.955408 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:58.062996 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.062953 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-config\") pod \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.062996 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063002 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52xrz\" (UniqueName: \"kubernetes.io/projected/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-kube-api-access-52xrz\") pod \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.063250 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063021 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-oauth-serving-cert\") pod \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.063250 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063049 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-service-ca\") pod \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.063250 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063167 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-serving-cert\") pod \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.063250 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063206 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-oauth-config\") pod 
\"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.063468 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063254 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-trusted-ca-bundle\") pod \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\" (UID: \"1c2d7379-c701-4b71-9b51-3f74dd1d8b90\") " Apr 17 11:19:58.063468 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063429 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-config" (OuterVolumeSpecName: "console-config") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:58.063577 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063463 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-service-ca" (OuterVolumeSpecName: "service-ca") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:58.063577 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063467 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:58.063713 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063693 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.063771 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063719 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-oauth-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.063771 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063734 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-service-ca\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.063863 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.063819 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:58.065591 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.065551 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:58.065677 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.065593 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:58.065677 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.065655 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-kube-api-access-52xrz" (OuterVolumeSpecName: "kube-api-access-52xrz") pod "1c2d7379-c701-4b71-9b51-3f74dd1d8b90" (UID: "1c2d7379-c701-4b71-9b51-3f74dd1d8b90"). InnerVolumeSpecName "kube-api-access-52xrz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:58.164152 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.164105 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52xrz\" (UniqueName: \"kubernetes.io/projected/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-kube-api-access-52xrz\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.164152 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.164144 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.164152 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.164154 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-console-oauth-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.164152 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.164163 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2d7379-c701-4b71-9b51-3f74dd1d8b90-trusted-ca-bundle\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:19:58.675310 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.675281 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d7467f6c-8bz7t_1c2d7379-c701-4b71-9b51-3f74dd1d8b90/console/0.log" Apr 17 11:19:58.675736 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.675322 2579 generic.go:358] "Generic (PLEG): container finished" podID="1c2d7379-c701-4b71-9b51-3f74dd1d8b90" containerID="69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622" exitCode=2 Apr 17 11:19:58.675736 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.675384 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d7467f6c-8bz7t" 
event={"ID":"1c2d7379-c701-4b71-9b51-3f74dd1d8b90","Type":"ContainerDied","Data":"69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622"} Apr 17 11:19:58.675736 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.675429 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74d7467f6c-8bz7t" Apr 17 11:19:58.675736 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.675436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d7467f6c-8bz7t" event={"ID":"1c2d7379-c701-4b71-9b51-3f74dd1d8b90","Type":"ContainerDied","Data":"636d2f1aa53fab8c6a0d1f3f45ef48238a63d4a63401311ad5cf9cd967e9c2cd"} Apr 17 11:19:58.675736 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.675458 2579 scope.go:117] "RemoveContainer" containerID="69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622" Apr 17 11:19:58.684880 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.684854 2579 scope.go:117] "RemoveContainer" containerID="69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622" Apr 17 11:19:58.685236 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:19:58.685216 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622\": container with ID starting with 69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622 not found: ID does not exist" containerID="69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622" Apr 17 11:19:58.685282 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.685250 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622"} err="failed to get container status \"69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622\": rpc error: code = NotFound desc = could not find container \"69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622\": container with ID starting with 69ee5aa83392376416c5a38e19aeac078585cd32a4d1f774ec900124ad198622 not found: ID does not exist" Apr 17 11:19:58.696657 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.696623 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d7467f6c-8bz7t"] Apr 17 11:19:58.700109 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.700074 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74d7467f6c-8bz7t"] Apr 17 11:19:58.984189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:19:58.984100 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2d7379-c701-4b71-9b51-3f74dd1d8b90" path="/var/lib/kubelet/pods/1c2d7379-c701-4b71-9b51-3f74dd1d8b90/volumes" Apr 17 11:20:16.926776 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:16.926733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: \"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:20:16.929036 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:16.929012 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d87433-8bc9-4d18-bab8-e6f889a4b52d-metrics-certs\") pod \"network-metrics-daemon-n2dw9\" (UID: 
\"00d87433-8bc9-4d18-bab8-e6f889a4b52d\") " pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:20:17.084003 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:17.083971 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q4j6t\"" Apr 17 11:20:17.091901 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:17.091868 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2dw9" Apr 17 11:20:17.242904 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:17.242865 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n2dw9"] Apr 17 11:20:17.246673 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:20:17.246640 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d87433_8bc9_4d18_bab8_e6f889a4b52d.slice/crio-fb30ec70d8798e05bbe98c31998dcf023f5760f5c9e1bf7acac2ed055c788f2f WatchSource:0}: Error finding container fb30ec70d8798e05bbe98c31998dcf023f5760f5c9e1bf7acac2ed055c788f2f: Status 404 returned error can't find the container with id fb30ec70d8798e05bbe98c31998dcf023f5760f5c9e1bf7acac2ed055c788f2f Apr 17 11:20:17.735265 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:17.735213 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2dw9" event={"ID":"00d87433-8bc9-4d18-bab8-e6f889a4b52d","Type":"ContainerStarted","Data":"fb30ec70d8798e05bbe98c31998dcf023f5760f5c9e1bf7acac2ed055c788f2f"} Apr 17 11:20:18.559068 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:18.558959 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:18.575988 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:18.575949 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:18.739752 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:18.739709 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2dw9" event={"ID":"00d87433-8bc9-4d18-bab8-e6f889a4b52d","Type":"ContainerStarted","Data":"911942b853cc64c019791710dfca9a2c8aaf93325adaaf64064ed4e3fb142f7e"} Apr 17 11:20:18.739752 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:18.739749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2dw9" event={"ID":"00d87433-8bc9-4d18-bab8-e6f889a4b52d","Type":"ContainerStarted","Data":"a974e899eca28ce96c656220b243f4ee80daa7a6a853b7ce4bafdad0eca3a02d"} Apr 17 11:20:18.755959 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:18.755932 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:20:18.773342 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:18.773282 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n2dw9" podStartSLOduration=252.743236395 podStartE2EDuration="4m13.773266945s" podCreationTimestamp="2026-04-17 11:16:05 +0000 UTC" firstStartedPulling="2026-04-17 11:20:17.249022881 +0000 UTC m=+252.828153606" lastFinishedPulling="2026-04-17 11:20:18.279053427 +0000 UTC m=+253.858184156" observedRunningTime="2026-04-17 11:20:18.771736625 +0000 UTC m=+254.350867398" watchObservedRunningTime="2026-04-17 11:20:18.773266945 +0000 UTC m=+254.352397691" 
Apr 17 11:20:40.249403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249292 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86c977b9f7-5c9kf"] Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249629 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caf44699-05e4-4cb3-b71a-59485125b6f6" containerName="console" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249642 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf44699-05e4-4cb3-b71a-59485125b6f6" containerName="console" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249656 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c2d7379-c701-4b71-9b51-3f74dd1d8b90" containerName="console" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249662 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2d7379-c701-4b71-9b51-3f74dd1d8b90" containerName="console" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249672 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" containerName="registry" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249677 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" containerName="registry" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249732 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="caf44699-05e4-4cb3-b71a-59485125b6f6" containerName="console" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249741 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bde7b410-7e1b-49fd-9dc0-496cc0d7662c" containerName="registry" Apr 17 11:20:40.249968 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.249750 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c2d7379-c701-4b71-9b51-3f74dd1d8b90" containerName="console" Apr 17 11:20:40.254007 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.253973 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.263549 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.263514 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c977b9f7-5c9kf"] Apr 17 11:20:40.424932 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.424873 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-service-ca\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.424932 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.424931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-trusted-ca-bundle\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.425195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.424964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-oauth-config\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.425195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.425008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-oauth-serving-cert\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.425195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.425083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-serving-cert\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.425195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.425126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-config\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.425195 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.425158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qvz6\" (UniqueName: \"kubernetes.io/projected/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-kube-api-access-7qvz6\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.525891 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.525836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-serving-cert\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.525891 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.525881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-config\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526206 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.525920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qvz6\" (UniqueName: \"kubernetes.io/projected/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-kube-api-access-7qvz6\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526206 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.525962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-service-ca\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526206 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.525986 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-trusted-ca-bundle\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526206 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.526013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-oauth-config\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526206 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.526061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-oauth-serving-cert\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526872 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.526843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-oauth-serving-cert\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526995 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.526877 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-service-ca\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.526995 ip-10-0-133-230 kubenswrapper[2579]: 
I0417 11:20:40.526967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-config\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.527095 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.526986 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-trusted-ca-bundle\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.528645 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.528618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-serving-cert\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.528845 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.528826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-oauth-config\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.538907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.538873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qvz6\" (UniqueName: \"kubernetes.io/projected/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-kube-api-access-7qvz6\") pod \"console-86c977b9f7-5c9kf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.565594 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.565551 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:40.720402 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.720231 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c977b9f7-5c9kf"] Apr 17 11:20:40.725518 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:20:40.725480 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cdd6c3f_29e3_491c_b9e6_2cf6c9508bdf.slice/crio-5f76e8d5bafa432c3ff6a8ac5e9f3dc34426883bbaecf31fd3b524df48512fd5 WatchSource:0}: Error finding container 5f76e8d5bafa432c3ff6a8ac5e9f3dc34426883bbaecf31fd3b524df48512fd5: Status 404 returned error can't find the container with id 5f76e8d5bafa432c3ff6a8ac5e9f3dc34426883bbaecf31fd3b524df48512fd5 Apr 17 11:20:40.811778 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.811737 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c977b9f7-5c9kf" event={"ID":"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf","Type":"ContainerStarted","Data":"a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b"} Apr 17 11:20:40.811778 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.811786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c977b9f7-5c9kf" event={"ID":"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf","Type":"ContainerStarted","Data":"5f76e8d5bafa432c3ff6a8ac5e9f3dc34426883bbaecf31fd3b524df48512fd5"} Apr 17 11:20:40.837330 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:40.837267 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86c977b9f7-5c9kf" podStartSLOduration=0.837249115 podStartE2EDuration="837.249115ms" podCreationTimestamp="2026-04-17 11:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:40.835703184 +0000 UTC m=+276.414833931" watchObservedRunningTime="2026-04-17 11:20:40.837249115 +0000 UTC m=+276.416379862" Apr 17 11:20:50.566609 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.566560 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:50.566609 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.566610 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:50.571601 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.571569 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:50.843478 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.843400 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:20:50.881091 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.881050 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5457b59d48-qx8rd"] Apr 17 11:20:50.983848 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.983816 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-859b59cb8c-g4pjx"] Apr 17 11:20:50.989702 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.989679 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:50.993286 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:50.993240 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859b59cb8c-g4pjx"] Apr 17 11:20:51.009559 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009525 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-service-ca\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.009763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009570 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-serving-cert\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.009763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-trusted-ca-bundle\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.009763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-oauth-config\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.009763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-console-config\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.009763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzm6n\" (UniqueName: \"kubernetes.io/projected/ecfd6d10-a870-4db5-88fe-9515d2093049-kube-api-access-qzm6n\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.009763 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.009723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-oauth-serving-cert\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111032 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.110934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-oauth-config\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111032 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.110989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-console-config\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111032 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzm6n\" (UniqueName: \"kubernetes.io/projected/ecfd6d10-a870-4db5-88fe-9515d2093049-kube-api-access-qzm6n\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111032 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-oauth-serving-cert\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111299 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-service-ca\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111299 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-serving-cert\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111299 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-trusted-ca-bundle\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111778 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-console-config\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111882 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.111801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-service-ca\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.111882 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:20:51.111849 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-oauth-serving-cert\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.112319 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.112302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-trusted-ca-bundle\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.113740 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.113714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-oauth-config\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.114318 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.114282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-serving-cert\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.119185 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.119158 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzm6n\" (UniqueName: \"kubernetes.io/projected/ecfd6d10-a870-4db5-88fe-9515d2093049-kube-api-access-qzm6n\") pod \"console-859b59cb8c-g4pjx\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.301593 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.301546 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:20:51.431499 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.431450 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859b59cb8c-g4pjx"] Apr 17 11:20:51.434575 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:20:51.434538 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecfd6d10_a870_4db5_88fe_9515d2093049.slice/crio-c06ef585ad46234f8a0543c7803fff80038b34164cbe2f67d87f400b7e86ae20 WatchSource:0}: Error finding container c06ef585ad46234f8a0543c7803fff80038b34164cbe2f67d87f400b7e86ae20: Status 404 returned error can't find the container with id c06ef585ad46234f8a0543c7803fff80038b34164cbe2f67d87f400b7e86ae20 Apr 17 11:20:51.842889 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.842846 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859b59cb8c-g4pjx" event={"ID":"ecfd6d10-a870-4db5-88fe-9515d2093049","Type":"ContainerStarted","Data":"d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd"} Apr 17 11:20:51.842889 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.842894 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859b59cb8c-g4pjx" event={"ID":"ecfd6d10-a870-4db5-88fe-9515d2093049","Type":"ContainerStarted","Data":"c06ef585ad46234f8a0543c7803fff80038b34164cbe2f67d87f400b7e86ae20"} Apr 17 11:20:51.860746 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:20:51.860694 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-859b59cb8c-g4pjx" podStartSLOduration=1.860675385 podStartE2EDuration="1.860675385s" podCreationTimestamp="2026-04-17 11:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:51.859403007 +0000 UTC m=+287.438533770" watchObservedRunningTime="2026-04-17 11:20:51.860675385 +0000 UTC m=+287.439806132" Apr 17 11:21:01.302510 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:01.302458 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:21:01.302510 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:01.302527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:21:01.307458 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:01.307427 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:21:01.878587 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:01.878542 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:21:01.945226 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:01.945182 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c977b9f7-5c9kf"] Apr 17 11:21:04.897930 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:04.897898 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:21:04.898382 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:04.897904 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:21:04.902612 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:04.902584 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:21:15.902404 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:15.902322 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5457b59d48-qx8rd" podUID="f0d4464c-c569-48dd-a58f-7a416274512b" containerName="console" containerID="cri-o://b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10" gracePeriod=15 Apr 17 11:21:16.144344 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.144319 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5457b59d48-qx8rd_f0d4464c-c569-48dd-a58f-7a416274512b/console/0.log" Apr 17 11:21:16.144518 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.144407 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:21:16.237865 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.237766 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-console-config\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.237865 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.237829 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-service-ca\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.238101 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.237867 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrb5\" (UniqueName: \"kubernetes.io/projected/f0d4464c-c569-48dd-a58f-7a416274512b-kube-api-access-tgrb5\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.238101 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.237892 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-oauth-serving-cert\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.238101 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.237924 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-trusted-ca-bundle\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.238101 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.237978 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-serving-cert\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.238101 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.238005 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-oauth-config\") pod \"f0d4464c-c569-48dd-a58f-7a416274512b\" (UID: \"f0d4464c-c569-48dd-a58f-7a416274512b\") " Apr 17 11:21:16.238377 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.238293 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-console-config" (OuterVolumeSpecName: "console-config") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:16.238377 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.238301 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-service-ca" (OuterVolumeSpecName: "service-ca") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:16.238488 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.238459 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:16.238587 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.238560 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:16.240400 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.240373 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:21:16.240400 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.240383 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d4464c-c569-48dd-a58f-7a416274512b-kube-api-access-tgrb5" (OuterVolumeSpecName: "kube-api-access-tgrb5") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "kube-api-access-tgrb5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:21:16.240527 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.240450 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f0d4464c-c569-48dd-a58f-7a416274512b" (UID: "f0d4464c-c569-48dd-a58f-7a416274512b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:21:16.339424 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339348 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-console-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.339424 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339422 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-service-ca\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.339424 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339431 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tgrb5\" (UniqueName: \"kubernetes.io/projected/f0d4464c-c569-48dd-a58f-7a416274512b-kube-api-access-tgrb5\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.339662 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339441 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-oauth-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.339662 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339451 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0d4464c-c569-48dd-a58f-7a416274512b-trusted-ca-bundle\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.339662 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339460 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.339662 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.339468 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0d4464c-c569-48dd-a58f-7a416274512b-console-oauth-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:16.919953 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.919919 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5457b59d48-qx8rd_f0d4464c-c569-48dd-a58f-7a416274512b/console/0.log" Apr 17 11:21:16.920433 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.919968 2579 generic.go:358] "Generic (PLEG): container finished" podID="f0d4464c-c569-48dd-a58f-7a416274512b" containerID="b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10" exitCode=2 Apr 17 11:21:16.920433 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.920036 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5457b59d48-qx8rd" Apr 17 11:21:16.920433 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.920058 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5457b59d48-qx8rd" event={"ID":"f0d4464c-c569-48dd-a58f-7a416274512b","Type":"ContainerDied","Data":"b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10"} Apr 17 11:21:16.920433 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.920099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5457b59d48-qx8rd" event={"ID":"f0d4464c-c569-48dd-a58f-7a416274512b","Type":"ContainerDied","Data":"56fc955afc23421d8142bd32689853a2a9f70b1e61ec11ce397df0d16692c6db"} Apr 17 11:21:16.920433 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.920115 2579 scope.go:117] "RemoveContainer" containerID="b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10" Apr 17 11:21:16.929531 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.929509 2579 scope.go:117] "RemoveContainer" containerID="b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10" Apr 17 11:21:16.929863 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:21:16.929842 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10\": container with ID starting with b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10 not found: ID does not exist" containerID="b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10" Apr 17 11:21:16.929915 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.929872 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10"} err="failed to get container status \"b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10\": rpc error: code = NotFound desc = could not find container \"b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10\": container with ID starting with b8eb8e601f18bc47c255ca084ad7caa96f54ab57de663b4ffdf6b77a93962e10 not found: ID does not exist" Apr 17 11:21:16.941046 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.941013 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5457b59d48-qx8rd"] Apr 17 11:21:16.946980 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.946942 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5457b59d48-qx8rd"] Apr 17 11:21:16.984117 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:16.984086 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d4464c-c569-48dd-a58f-7a416274512b" path="/var/lib/kubelet/pods/f0d4464c-c569-48dd-a58f-7a416274512b/volumes" Apr 17 11:21:26.966690 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:26.966627 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86c977b9f7-5c9kf" podUID="4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" containerName="console" containerID="cri-o://a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b" gracePeriod=15 Apr 17 11:21:27.210590 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.210560 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c977b9f7-5c9kf_4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf/console/0.log" Apr 17 11:21:27.210754 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:21:27.210628 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:21:27.326798 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.326756 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-oauth-serving-cert\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327004 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.326816 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qvz6\" (UniqueName: \"kubernetes.io/projected/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-kube-api-access-7qvz6\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327004 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.326840 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-oauth-config\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327004 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.326856 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-serving-cert\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327004 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.326957 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-service-ca\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.327049 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-trusted-ca-bundle\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327189 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.327118 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-config\") pod \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\" (UID: \"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf\") " Apr 17 11:21:27.327392 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.327327 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:27.327600 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.327471 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-service-ca" (OuterVolumeSpecName: "service-ca") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:27.327727 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.327680 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:27.327727 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.327701 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-config" (OuterVolumeSpecName: "console-config") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:21:27.329285 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.329258 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:21:27.329634 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.329609 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:21:27.329686 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.329611 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-kube-api-access-7qvz6" (OuterVolumeSpecName: "kube-api-access-7qvz6") pod "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" (UID: "4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf"). InnerVolumeSpecName "kube-api-access-7qvz6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:21:27.428027 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.427970 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.428027 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.428021 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-oauth-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.428027 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.428032 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qvz6\" (UniqueName: \"kubernetes.io/projected/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-kube-api-access-7qvz6\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.428027 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.428043 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-oauth-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.428408 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.428053 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-console-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.428408 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.428061 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-service-ca\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.428408 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.428070 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf-trusted-ca-bundle\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:21:27.959810 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.959780 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c977b9f7-5c9kf_4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf/console/0.log" Apr 17 11:21:27.959998 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.959823 2579 generic.go:358] "Generic (PLEG): container finished" podID="4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" containerID="a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b" exitCode=2 Apr 17 11:21:27.959998 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.959875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c977b9f7-5c9kf" event={"ID":"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf","Type":"ContainerDied","Data":"a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b"} Apr 17 11:21:27.959998 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.959900 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c977b9f7-5c9kf" event={"ID":"4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf","Type":"ContainerDied","Data":"5f76e8d5bafa432c3ff6a8ac5e9f3dc34426883bbaecf31fd3b524df48512fd5"} Apr 17 11:21:27.959998 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.959916 2579 
scope.go:117] "RemoveContainer" containerID="a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b" Apr 17 11:21:27.959998 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.959926 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c977b9f7-5c9kf" Apr 17 11:21:27.969270 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.969084 2579 scope.go:117] "RemoveContainer" containerID="a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b" Apr 17 11:21:27.969555 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:21:27.969444 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b\": container with ID starting with a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b not found: ID does not exist" containerID="a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b" Apr 17 11:21:27.969555 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.969485 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b"} err="failed to get container status \"a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b\": rpc error: code = NotFound desc = could not find container \"a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b\": container with ID starting with a6c1ad41bfa85203fd5e85b1c0579945312434ed9035ace49364a126a0f0636b not found: ID does not exist" Apr 17 11:21:27.981368 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.981309 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c977b9f7-5c9kf"] Apr 17 11:21:27.987291 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:27.987254 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86c977b9f7-5c9kf"] Apr 17 11:21:28.984241 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:21:28.984199 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" path="/var/lib/kubelet/pods/4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf/volumes" Apr 17 11:25:26.144984 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.144951 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jclx7"] Apr 17 11:25:26.145598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.145368 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" containerName="console" Apr 17 11:25:26.145598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.145388 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" containerName="console" Apr 17 11:25:26.145598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.145407 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0d4464c-c569-48dd-a58f-7a416274512b" containerName="console" Apr 17 11:25:26.145598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.145415 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d4464c-c569-48dd-a58f-7a416274512b" containerName="console" Apr 17 11:25:26.145598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.145488 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0d4464c-c569-48dd-a58f-7a416274512b" containerName="console" Apr 17 11:25:26.145598 ip-10-0-133-230 
kubenswrapper[2579]: I0417 11:25:26.145501 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cdd6c3f-29e3-491c-b9e6-2cf6c9508bdf" containerName="console" Apr 17 11:25:26.148466 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.148443 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jclx7" Apr 17 11:25:26.150531 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.150511 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 11:25:26.150895 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.150873 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 11:25:26.150997 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.150908 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 11:25:26.150997 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.150921 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-69gzf\"" Apr 17 11:25:26.155878 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.155856 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jclx7"] Apr 17 11:25:26.211428 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.211402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ktwc\" (UniqueName: \"kubernetes.io/projected/bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37-kube-api-access-2ktwc\") pod \"s3-init-jclx7\" (UID: \"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37\") " pod="kserve/s3-init-jclx7" Apr 17 11:25:26.312464 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.312432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ktwc\" (UniqueName: \"kubernetes.io/projected/bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37-kube-api-access-2ktwc\") pod \"s3-init-jclx7\" (UID: \"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37\") " pod="kserve/s3-init-jclx7" Apr 17 11:25:26.321789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.321765 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ktwc\" (UniqueName: \"kubernetes.io/projected/bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37-kube-api-access-2ktwc\") pod \"s3-init-jclx7\" (UID: \"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37\") " pod="kserve/s3-init-jclx7" Apr 17 11:25:26.468945 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.468860 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jclx7" Apr 17 11:25:26.585646 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.585471 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jclx7"] Apr 17 11:25:26.587891 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:25:26.587867 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf0fdd04_7b9f_4c8b_8a0e_66fde160eb37.slice/crio-0534e1592ffa55140515dc96a61c90d2954ecbfdb5eda26e9e84fef71c565fc1 WatchSource:0}: Error finding container 0534e1592ffa55140515dc96a61c90d2954ecbfdb5eda26e9e84fef71c565fc1: Status 404 returned error can't find the container with id 0534e1592ffa55140515dc96a61c90d2954ecbfdb5eda26e9e84fef71c565fc1 Apr 17 11:25:26.590099 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.590080 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:25:26.641964 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:26.641940 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jclx7" event={"ID":"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37","Type":"ContainerStarted","Data":"0534e1592ffa55140515dc96a61c90d2954ecbfdb5eda26e9e84fef71c565fc1"} Apr 17 11:25:31.658135 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:31.658098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jclx7" event={"ID":"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37","Type":"ContainerStarted","Data":"f2e4d7c68bd220c6bc84b0affd532f1b5237d8813247198fa3965055ce00416a"} Apr 17 11:25:31.673934 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:31.673829 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jclx7" podStartSLOduration=1.197583834 podStartE2EDuration="5.67381354s" podCreationTimestamp="2026-04-17 11:25:26 +0000 UTC" firstStartedPulling="2026-04-17 11:25:26.590255147 +0000 UTC m=+562.169385877" lastFinishedPulling="2026-04-17 11:25:31.066484847 +0000 UTC m=+566.645615583" observedRunningTime="2026-04-17 11:25:31.673578604 +0000 UTC m=+567.252709352" watchObservedRunningTime="2026-04-17 11:25:31.67381354 +0000 UTC m=+567.252944290" Apr 17 11:25:34.667208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:34.667171 2579 generic.go:358] "Generic (PLEG): container finished" podID="bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37" containerID="f2e4d7c68bd220c6bc84b0affd532f1b5237d8813247198fa3965055ce00416a" exitCode=0 Apr 17 11:25:34.667208 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:34.667209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jclx7" event={"ID":"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37","Type":"ContainerDied","Data":"f2e4d7c68bd220c6bc84b0affd532f1b5237d8813247198fa3965055ce00416a"} Apr 17 11:25:35.786365 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:35.786329 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jclx7" Apr 17 11:25:35.887483 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:35.887452 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ktwc\" (UniqueName: \"kubernetes.io/projected/bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37-kube-api-access-2ktwc\") pod \"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37\" (UID: \"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37\") " Apr 17 11:25:35.889473 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:35.889449 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37-kube-api-access-2ktwc" (OuterVolumeSpecName: "kube-api-access-2ktwc") pod "bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37" (UID: "bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37"). InnerVolumeSpecName "kube-api-access-2ktwc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:35.988370 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:35.988295 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ktwc\" (UniqueName: \"kubernetes.io/projected/bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37-kube-api-access-2ktwc\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:25:36.673550 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:36.673520 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jclx7" Apr 17 11:25:36.673550 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:36.673531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jclx7" event={"ID":"bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37","Type":"ContainerDied","Data":"0534e1592ffa55140515dc96a61c90d2954ecbfdb5eda26e9e84fef71c565fc1"} Apr 17 11:25:36.673550 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:36.673557 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0534e1592ffa55140515dc96a61c90d2954ecbfdb5eda26e9e84fef71c565fc1" Apr 17 11:25:40.091606 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.091565 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rcjlf/must-gather-dr9f6"] Apr 17 11:25:40.091957 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.091850 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37" containerName="s3-init" Apr 17 11:25:40.091957 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.091860 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37" containerName="s3-init" Apr 17 11:25:40.091957 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.091915 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37" containerName="s3-init" Apr 17 11:25:40.095035 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.095016 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.097115 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.097092 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rcjlf\"/\"openshift-service-ca.crt\"" Apr 17 11:25:40.097241 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.097163 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rcjlf\"/\"kube-root-ca.crt\"" Apr 17 11:25:40.097419 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.097397 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rcjlf\"/\"default-dockercfg-xlmv4\"" Apr 17 11:25:40.102799 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.102774 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rcjlf/must-gather-dr9f6"] Apr 17 11:25:40.117323 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.117299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/99583e95-a454-4a19-92d9-534f3ea69b89-must-gather-output\") pod \"must-gather-dr9f6\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.117435 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.117331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjcj\" (UniqueName: \"kubernetes.io/projected/99583e95-a454-4a19-92d9-534f3ea69b89-kube-api-access-5kjcj\") pod \"must-gather-dr9f6\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.218679 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.218645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/99583e95-a454-4a19-92d9-534f3ea69b89-must-gather-output\") pod \"must-gather-dr9f6\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.218679 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.218680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjcj\" (UniqueName: \"kubernetes.io/projected/99583e95-a454-4a19-92d9-534f3ea69b89-kube-api-access-5kjcj\") pod \"must-gather-dr9f6\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.218972 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.218953 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/99583e95-a454-4a19-92d9-534f3ea69b89-must-gather-output\") pod \"must-gather-dr9f6\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.227076 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.227046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjcj\" (UniqueName: \"kubernetes.io/projected/99583e95-a454-4a19-92d9-534f3ea69b89-kube-api-access-5kjcj\") pod \"must-gather-dr9f6\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.412011 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.411923 2579 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:25:40.530326 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.530298 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rcjlf/must-gather-dr9f6"] Apr 17 11:25:40.533531 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:25:40.533499 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99583e95_a454_4a19_92d9_534f3ea69b89.slice/crio-59d5977c54340580fd397de2290cd64f4ab6354788be77d280203e014fcc98dc WatchSource:0}: Error finding container 59d5977c54340580fd397de2290cd64f4ab6354788be77d280203e014fcc98dc: Status 404 returned error can't find the container with id 59d5977c54340580fd397de2290cd64f4ab6354788be77d280203e014fcc98dc Apr 17 11:25:40.684841 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:40.684753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" event={"ID":"99583e95-a454-4a19-92d9-534f3ea69b89","Type":"ContainerStarted","Data":"59d5977c54340580fd397de2290cd64f4ab6354788be77d280203e014fcc98dc"} Apr 17 11:25:45.702313 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:45.702273 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" event={"ID":"99583e95-a454-4a19-92d9-534f3ea69b89","Type":"ContainerStarted","Data":"6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a"} Apr 17 11:25:45.702313 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:45.702312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" event={"ID":"99583e95-a454-4a19-92d9-534f3ea69b89","Type":"ContainerStarted","Data":"bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026"} Apr 17 11:25:45.721936 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:45.721884 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" podStartSLOduration=1.4576302079999999 podStartE2EDuration="5.721868572s" podCreationTimestamp="2026-04-17 11:25:40 +0000 UTC" firstStartedPulling="2026-04-17 11:25:40.535068418 +0000 UTC m=+576.114199144" lastFinishedPulling="2026-04-17 11:25:44.79930678 +0000 UTC m=+580.378437508" observedRunningTime="2026-04-17 11:25:45.721642587 +0000 UTC m=+581.300773370" watchObservedRunningTime="2026-04-17 11:25:45.721868572 +0000 UTC m=+581.300999386" Apr 17 11:25:50.163246 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.163201 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fc8f9d94-628vp"] Apr 17 11:25:50.169398 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.169337 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.197549 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.197516 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fc8f9d94-628vp"] Apr 17 11:25:50.206883 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.206847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-oauth-serving-cert\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.207056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.206897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-trusted-ca-bundle\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.207056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.206973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-service-ca\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.207056 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.207046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-serving-cert\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.207225 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.207115 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-oauth-config\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.207225 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.207144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7d4r\" (UniqueName: \"kubernetes.io/projected/e83829e9-5c5c-4a34-9d7b-6558311c095c-kube-api-access-v7d4r\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.207225 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.207161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-config\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307587 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-oauth-serving-cert\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-trusted-ca-bundle\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-service-ca\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-serving-cert\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-oauth-config\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7d4r\" (UniqueName: \"kubernetes.io/projected/e83829e9-5c5c-4a34-9d7b-6558311c095c-kube-api-access-v7d4r\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.307789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.307759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-config\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.308798 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.308764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-service-ca\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.308934 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.308805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-oauth-serving-cert\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.309000 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:25:50.308929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-config\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.309059 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.309039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e83829e9-5c5c-4a34-9d7b-6558311c095c-trusted-ca-bundle\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.310232 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.310204 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-serving-cert\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.310374 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.310311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e83829e9-5c5c-4a34-9d7b-6558311c095c-console-oauth-config\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.316451 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.316417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7d4r\" (UniqueName: \"kubernetes.io/projected/e83829e9-5c5c-4a34-9d7b-6558311c095c-kube-api-access-v7d4r\") pod \"console-6fc8f9d94-628vp\" (UID: \"e83829e9-5c5c-4a34-9d7b-6558311c095c\") " pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.484683 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.484591 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:25:50.626097 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.626036 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fc8f9d94-628vp"] Apr 17 11:25:50.629175 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:25:50.629137 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode83829e9_5c5c_4a34_9d7b_6558311c095c.slice/crio-c058f3bf8a9f6110c70d4a21f740ea8ae800556cc912ad17f5b8a66cb37ad8a3 WatchSource:0}: Error finding container c058f3bf8a9f6110c70d4a21f740ea8ae800556cc912ad17f5b8a66cb37ad8a3: Status 404 returned error can't find the container with id c058f3bf8a9f6110c70d4a21f740ea8ae800556cc912ad17f5b8a66cb37ad8a3 Apr 17 11:25:50.718100 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.718054 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc8f9d94-628vp" event={"ID":"e83829e9-5c5c-4a34-9d7b-6558311c095c","Type":"ContainerStarted","Data":"6df6670d01ac73547e944a934de1828282416b3ef5caf631553ffc1f44291677"} Apr 17 11:25:50.718100 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.718097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc8f9d94-628vp" event={"ID":"e83829e9-5c5c-4a34-9d7b-6558311c095c","Type":"ContainerStarted","Data":"c058f3bf8a9f6110c70d4a21f740ea8ae800556cc912ad17f5b8a66cb37ad8a3"} Apr 17 11:25:50.740927 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:25:50.740803 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fc8f9d94-628vp" podStartSLOduration=0.740783103 podStartE2EDuration="740.783103ms" podCreationTimestamp="2026-04-17 11:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:25:50.739511861 +0000 UTC m=+586.318642612" watchObservedRunningTime="2026-04-17 11:25:50.740783103 +0000 UTC m=+586.319913857" Apr 17 11:26:00.485225 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:00.485183 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:26:00.485735 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:00.485242 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:26:00.489860 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:00.489837 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:26:00.754813 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:00.754780 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fc8f9d94-628vp" Apr 17 11:26:00.802230 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:00.802200 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-859b59cb8c-g4pjx"] Apr 17 11:26:04.921789 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:04.921759 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:26:04.922211 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:04.922115 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:26:05.766478 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:05.766443 2579 generic.go:358] "Generic (PLEG): container finished" podID="99583e95-a454-4a19-92d9-534f3ea69b89" containerID="bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026" exitCode=0 Apr 17 11:26:05.766707 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:05.766525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" event={"ID":"99583e95-a454-4a19-92d9-534f3ea69b89","Type":"ContainerDied","Data":"bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026"} Apr 17 11:26:05.766875 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:05.766861 2579 scope.go:117] "RemoveContainer" containerID="bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026" Apr 17 11:26:06.304469 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:06.304434 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rcjlf_must-gather-dr9f6_99583e95-a454-4a19-92d9-534f3ea69b89/gather/0.log" Apr 17 11:26:09.459731 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:09.459698 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dhjrk_e9ac73cd-3758-404e-bd44-1926d9b9ac58/global-pull-secret-syncer/0.log" Apr 17 11:26:09.607774 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:09.607745 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mhcgl_01b17b1c-2eb7-4ac2-bfa8-1dd57a5dd282/konnectivity-agent/0.log" Apr 17 11:26:09.663610 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:09.663573 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-230.ec2.internal_0ef42997bfb50561dbaa710ba959617d/haproxy/0.log" Apr 17 11:26:11.709132 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:11.709092 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rcjlf/must-gather-dr9f6"] Apr 17 11:26:11.709560 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:11.709290 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="copy" containerID="cri-o://6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a" gracePeriod=2 Apr 17 11:26:11.713838 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:11.713801 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rcjlf/must-gather-dr9f6"] Apr 17 11:26:11.940907 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:11.940883 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rcjlf_must-gather-dr9f6_99583e95-a454-4a19-92d9-534f3ea69b89/copy/0.log" Apr 17 11:26:11.941190 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:11.941174 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:26:11.942740 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:11.942711 2579 status_manager.go:895] "Failed to get status for pod" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" err="pods \"must-gather-dr9f6\" is forbidden: User \"system:node:ip-10-0-133-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rcjlf\": no relationship found between node 'ip-10-0-133-230.ec2.internal' and this object" Apr 17 11:26:12.107642 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.107604 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kjcj\" (UniqueName: \"kubernetes.io/projected/99583e95-a454-4a19-92d9-534f3ea69b89-kube-api-access-5kjcj\") pod \"99583e95-a454-4a19-92d9-534f3ea69b89\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " Apr 17 11:26:12.107800 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.107723 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/99583e95-a454-4a19-92d9-534f3ea69b89-must-gather-output\") pod \"99583e95-a454-4a19-92d9-534f3ea69b89\" (UID: \"99583e95-a454-4a19-92d9-534f3ea69b89\") " Apr 17 11:26:12.109028 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.108994 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99583e95-a454-4a19-92d9-534f3ea69b89-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "99583e95-a454-4a19-92d9-534f3ea69b89" (UID: "99583e95-a454-4a19-92d9-534f3ea69b89"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:26:12.109672 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.109651 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99583e95-a454-4a19-92d9-534f3ea69b89-kube-api-access-5kjcj" (OuterVolumeSpecName: "kube-api-access-5kjcj") pod "99583e95-a454-4a19-92d9-534f3ea69b89" (UID: "99583e95-a454-4a19-92d9-534f3ea69b89"). InnerVolumeSpecName "kube-api-access-5kjcj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:12.208884 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.208845 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5kjcj\" (UniqueName: \"kubernetes.io/projected/99583e95-a454-4a19-92d9-534f3ea69b89-kube-api-access-5kjcj\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:12.208884 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.208877 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/99583e95-a454-4a19-92d9-534f3ea69b89-must-gather-output\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:12.788047 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.788023 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rcjlf_must-gather-dr9f6_99583e95-a454-4a19-92d9-534f3ea69b89/copy/0.log" Apr 17 11:26:12.788469 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.788283 2579 generic.go:358] "Generic (PLEG): container finished" podID="99583e95-a454-4a19-92d9-534f3ea69b89" containerID="6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a" exitCode=143 Apr 17 11:26:12.788469 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.788405 2579 scope.go:117] "RemoveContainer" containerID="6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a" Apr 17 11:26:12.788469 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.788431 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" Apr 17 11:26:12.790185 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.790152 2579 status_manager.go:895] "Failed to get status for pod" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" err="pods \"must-gather-dr9f6\" is forbidden: User \"system:node:ip-10-0-133-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rcjlf\": no relationship found between node 'ip-10-0-133-230.ec2.internal' and this object" Apr 17 11:26:12.796490 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.796452 2579 scope.go:117] "RemoveContainer" containerID="bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026" Apr 17 11:26:12.800865 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.800833 2579 status_manager.go:895] "Failed to get status for pod" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" pod="openshift-must-gather-rcjlf/must-gather-dr9f6" err="pods \"must-gather-dr9f6\" is forbidden: User \"system:node:ip-10-0-133-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rcjlf\": no relationship found between node 'ip-10-0-133-230.ec2.internal' and this object" Apr 17 11:26:12.811383 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.811342 2579 scope.go:117] "RemoveContainer" containerID="6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a" Apr 17 11:26:12.811705 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:26:12.811686 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a\": container with ID starting with 6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a not found: ID does not exist" containerID="6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a" Apr 17 
11:26:12.811767 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.811713 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a"} err="failed to get container status \"6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a\": rpc error: code = NotFound desc = could not find container \"6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a\": container with ID starting with 6454339477f75f21c51bd613aaa62b60865ecf6e2c9f155b3216d8cce839719a not found: ID does not exist" Apr 17 11:26:12.811767 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.811730 2579 scope.go:117] "RemoveContainer" containerID="bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026" Apr 17 11:26:12.811979 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:26:12.811961 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026\": container with ID starting with bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026 not found: ID does not exist" containerID="bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026" Apr 17 11:26:12.812019 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.811989 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026"} err="failed to get container status \"bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026\": rpc error: code = NotFound desc = could not find container \"bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026\": container with ID starting with bacdbae3079f863cd2d0292fbbe4a5f5bbac64b7099848e5d4bc3e79fac27026 not found: ID does not exist" Apr 17 11:26:12.986624 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:12.986579 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" path="/var/lib/kubelet/pods/99583e95-a454-4a19-92d9-534f3ea69b89/volumes" Apr 17 11:26:13.195146 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.195112 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zgdf8_3746d925-d855-4fc1-b6a3-cb7afc7b2395/cluster-monitoring-operator/0.log" Apr 17 11:26:13.547103 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.547081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-trf74_656c1bde-c18e-4071-a30b-e489b68a76fd/node-exporter/0.log" Apr 17 11:26:13.569219 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.569194 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-trf74_656c1bde-c18e-4071-a30b-e489b68a76fd/kube-rbac-proxy/0.log" Apr 17 11:26:13.595159 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.595134 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-trf74_656c1bde-c18e-4071-a30b-e489b68a76fd/init-textfile/0.log" Apr 17 11:26:13.711506 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.711476 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/prometheus/0.log" Apr 17 11:26:13.735076 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.734986 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/config-reloader/0.log" Apr 17 11:26:13.761260 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.761235 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/thanos-sidecar/0.log" Apr 17 11:26:13.786782 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.786755 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/kube-rbac-proxy-web/0.log" Apr 17 11:26:13.810224 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.810195 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/kube-rbac-proxy/0.log" Apr 17 11:26:13.835408 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.835378 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/kube-rbac-proxy-thanos/0.log" Apr 17 11:26:13.860726 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.860705 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_09ccee58-b7db-440e-abd6-ab0c4ac708e0/init-config-reloader/0.log" Apr 17 11:26:13.898579 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.898555 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rfzww_dc43c19d-0f18-4792-8aac-caf4ce068053/prometheus-operator/0.log" Apr 17 11:26:13.923602 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:13.923563 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rfzww_dc43c19d-0f18-4792-8aac-caf4ce068053/kube-rbac-proxy/0.log" Apr 17 11:26:16.090304 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.090269 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fc8f9d94-628vp_e83829e9-5c5c-4a34-9d7b-6558311c095c/console/0.log" Apr 17 11:26:16.113966 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.113938 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-859b59cb8c-g4pjx_ecfd6d10-a870-4db5-88fe-9515d2093049/console/0.log" Apr 17 11:26:16.546688 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.546645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-bhjx6_c0ab458a-5b75-403d-9929-6914742dd815/volume-data-source-validator/0.log" Apr 17 11:26:16.718013 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.717976 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w"] Apr 17 11:26:16.718338 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.718325 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="gather" Apr 17 11:26:16.718412 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.718340 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="gather" Apr 17 11:26:16.718412 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.718348 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="copy" Apr 17 11:26:16.718412 
ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.718369 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="copy" Apr 17 11:26:16.718506 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.718429 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="gather" Apr 17 11:26:16.718506 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.718440 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="99583e95-a454-4a19-92d9-534f3ea69b89" containerName="copy" Apr 17 11:26:16.721363 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.721326 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.723651 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.723625 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkw4c\"/\"openshift-service-ca.crt\"" Apr 17 11:26:16.724163 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.724144 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xkw4c\"/\"default-dockercfg-vtcwd\"" Apr 17 11:26:16.724274 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.724169 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkw4c\"/\"kube-root-ca.crt\"" Apr 17 11:26:16.731072 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.731044 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w"] Apr 17 11:26:16.845965 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.845875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-podres\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.845965 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.845925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klsfg\" (UniqueName: \"kubernetes.io/projected/dace49a3-79f0-4049-9f63-6c538afe4d4f-kube-api-access-klsfg\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.846158 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.845970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-proc\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.846158 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.845990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-lib-modules\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.846158 ip-10-0-133-230 kubenswrapper[2579]: I0417 
11:26:16.846053 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-sys\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947420 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-lib-modules\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947589 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-sys\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947589 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-lib-modules\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947589 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-sys\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947695 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-podres\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947695 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klsfg\" (UniqueName: \"kubernetes.io/projected/dace49a3-79f0-4049-9f63-6c538afe4d4f-kube-api-access-klsfg\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947695 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-proc\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947788 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-podres\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.947788 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.947741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dace49a3-79f0-4049-9f63-6c538afe4d4f-proc\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:16.956093 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:16.956067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klsfg\" (UniqueName: \"kubernetes.io/projected/dace49a3-79f0-4049-9f63-6c538afe4d4f-kube-api-access-klsfg\") pod \"perf-node-gather-daemonset-kgf2w\" (UID: \"dace49a3-79f0-4049-9f63-6c538afe4d4f\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:17.032383 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.032307 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:17.190286 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.190243 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w"] Apr 17 11:26:17.193992 ip-10-0-133-230 kubenswrapper[2579]: W0417 11:26:17.193959 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddace49a3_79f0_4049_9f63_6c538afe4d4f.slice/crio-34e7d81968dbf3f1e38650659da87585d2aac5d735b1d0efe93530adb8aaaa34 WatchSource:0}: Error finding container 34e7d81968dbf3f1e38650659da87585d2aac5d735b1d0efe93530adb8aaaa34: Status 404 returned error can't find the container with id 34e7d81968dbf3f1e38650659da87585d2aac5d735b1d0efe93530adb8aaaa34 Apr 17 11:26:17.402123 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.402048 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v9vzq_d3d7992a-df2f-42c4-b112-16554731e7e3/dns/0.log" Apr 17 11:26:17.423702 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.423669 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v9vzq_d3d7992a-df2f-42c4-b112-16554731e7e3/kube-rbac-proxy/0.log" Apr 17 11:26:17.493978 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.493946 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w2vns_498d4ba7-6d35-426a-ab1f-fbd6ce54d9bf/dns-node-resolver/0.log" Apr 17 11:26:17.806721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.806688 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" event={"ID":"dace49a3-79f0-4049-9f63-6c538afe4d4f","Type":"ContainerStarted","Data":"b5e1adfb0dcad335a3e1de0f076a86f7f24c2f3cf6377e8c86fb9152f7b86e85"} Apr 17 11:26:17.806721 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.806722 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" event={"ID":"dace49a3-79f0-4049-9f63-6c538afe4d4f","Type":"ContainerStarted","Data":"34e7d81968dbf3f1e38650659da87585d2aac5d735b1d0efe93530adb8aaaa34"} Apr 17 11:26:17.806930 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.806818 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:17.821488 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.821434 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" podStartSLOduration=1.8213983950000001 podStartE2EDuration="1.821398395s" podCreationTimestamp="2026-04-17 11:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:26:17.820839479 +0000 UTC m=+613.399970227" watchObservedRunningTime="2026-04-17 11:26:17.821398395 +0000 UTC m=+613.400529141" Apr 17 11:26:17.978906 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:17.978876 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-szsmw_6e92a775-6760-473c-a919-b7e7bcf242c5/node-ca/0.log" Apr 17 11:26:18.719530 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:18.719502 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-764bf577f6-vwbj5_88f36c9d-2a1c-4fa7-b48b-0bcca001c665/router/0.log" Apr 17 11:26:19.083914 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:19.083883 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jv8wn_623d8865-4f62-459b-929b-d12c6978284a/serve-healthcheck-canary/0.log" Apr 17 11:26:19.547334 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:19.547303 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-k6bbx_bbded126-d122-4dc3-b0bd-fd03d3ac38e7/kube-rbac-proxy/0.log" Apr 17 11:26:19.573748 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:19.573715 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-k6bbx_bbded126-d122-4dc3-b0bd-fd03d3ac38e7/exporter/0.log" Apr 17 11:26:19.596734 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:19.596708 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-k6bbx_bbded126-d122-4dc3-b0bd-fd03d3ac38e7/extractor/0.log" Apr 17 11:26:21.718531 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:21.718449 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jclx7_bf0fdd04-7b9f-4c8b-8a0e-66fde160eb37/s3-init/0.log" Apr 17 11:26:23.820077 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:23.820049 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-kgf2w" Apr 17 11:26:25.557108 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:25.557081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jkm5x_7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0/migrator/0.log" Apr 17 11:26:25.581070 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:25.581042 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jkm5x_7cfd83d0-4383-41e1-9dc0-7aedbd04ddb0/graceful-termination/0.log" Apr 17 11:26:25.824875 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:25.824840 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-859b59cb8c-g4pjx" podUID="ecfd6d10-a870-4db5-88fe-9515d2093049" containerName="console" 
containerID="cri-o://d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd" gracePeriod=15 Apr 17 11:26:26.069096 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.069073 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-859b59cb8c-g4pjx_ecfd6d10-a870-4db5-88fe-9515d2093049/console/0.log" Apr 17 11:26:26.069239 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.069136 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:26:26.121745 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121657 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzm6n\" (UniqueName: \"kubernetes.io/projected/ecfd6d10-a870-4db5-88fe-9515d2093049-kube-api-access-qzm6n\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.121745 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121727 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-oauth-serving-cert\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.121971 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121763 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-service-ca\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.121971 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121797 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-serving-cert\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.121971 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121823 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-console-config\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.121971 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121846 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-oauth-config\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.121971 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.121892 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-trusted-ca-bundle\") pod \"ecfd6d10-a870-4db5-88fe-9515d2093049\" (UID: \"ecfd6d10-a870-4db5-88fe-9515d2093049\") " Apr 17 11:26:26.122216 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.122110 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: 
"ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:26:26.122439 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.122314 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-service-ca" (OuterVolumeSpecName: "service-ca") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: "ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:26:26.122861 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.122829 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: "ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:26:26.123017 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.122993 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-console-config" (OuterVolumeSpecName: "console-config") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: "ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:26:26.124063 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.124034 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfd6d10-a870-4db5-88fe-9515d2093049-kube-api-access-qzm6n" (OuterVolumeSpecName: "kube-api-access-qzm6n") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: "ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "kube-api-access-qzm6n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:26.124253 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.124223 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: "ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:26:26.124346 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.124244 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ecfd6d10-a870-4db5-88fe-9515d2093049" (UID: "ecfd6d10-a870-4db5-88fe-9515d2093049"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:26:26.223302 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223257 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-oauth-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.223302 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223297 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-service-ca\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.223302 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223310 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-serving-cert\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.223570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223323 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-console-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.223570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223335 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecfd6d10-a870-4db5-88fe-9515d2093049-console-oauth-config\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.223570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223347 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecfd6d10-a870-4db5-88fe-9515d2093049-trusted-ca-bundle\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.223570 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.223390 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzm6n\" (UniqueName: \"kubernetes.io/projected/ecfd6d10-a870-4db5-88fe-9515d2093049-kube-api-access-qzm6n\") on node \"ip-10-0-133-230.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.831611 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.831510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-859b59cb8c-g4pjx_ecfd6d10-a870-4db5-88fe-9515d2093049/console/0.log" Apr 17 11:26:26.831611 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.831583 2579 generic.go:358] "Generic (PLEG): container finished" podID="ecfd6d10-a870-4db5-88fe-9515d2093049" containerID="d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd" exitCode=2 Apr 17 11:26:26.832045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.831650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859b59cb8c-g4pjx" event={"ID":"ecfd6d10-a870-4db5-88fe-9515d2093049","Type":"ContainerDied","Data":"d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd"} Apr 17 11:26:26.832045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.831684 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859b59cb8c-g4pjx" event={"ID":"ecfd6d10-a870-4db5-88fe-9515d2093049","Type":"ContainerDied","Data":"c06ef585ad46234f8a0543c7803fff80038b34164cbe2f67d87f400b7e86ae20"} Apr 17 11:26:26.832045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.831686 2579 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-859b59cb8c-g4pjx" Apr 17 11:26:26.832045 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.831700 2579 scope.go:117] "RemoveContainer" containerID="d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd" Apr 17 11:26:26.840667 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.840633 2579 scope.go:117] "RemoveContainer" containerID="d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd" Apr 17 11:26:26.840988 ip-10-0-133-230 kubenswrapper[2579]: E0417 11:26:26.840967 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd\": container with ID starting with d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd not found: ID does not exist" containerID="d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd" Apr 17 11:26:26.841057 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.840999 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd"} err="failed to get container status \"d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd\": rpc error: code = NotFound desc = could not find container \"d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd\": container with ID starting with d3421c4369d0a7b94fc064c5ef0441b5270885d85601bb3144b3f7b3cc6e13fd not found: ID does not exist" Apr 17 11:26:26.852403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.852368 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-859b59cb8c-g4pjx"] Apr 17 11:26:26.856317 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.856286 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-859b59cb8c-g4pjx"] Apr 17 11:26:26.984598 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:26.984565 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfd6d10-a870-4db5-88fe-9515d2093049" path="/var/lib/kubelet/pods/ecfd6d10-a870-4db5-88fe-9515d2093049/volumes" Apr 17 11:26:27.038309 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.038276 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/kube-multus-additional-cni-plugins/0.log" Apr 17 11:26:27.059077 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.059047 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/egress-router-binary-copy/0.log" Apr 17 11:26:27.080731 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.080706 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/cni-plugins/0.log" Apr 17 11:26:27.102117 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.102037 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/bond-cni-plugin/0.log" Apr 17 11:26:27.123545 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.123515 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/routeoverride-cni/0.log" Apr 17 11:26:27.146180 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.146151 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/whereabouts-cni-bincopy/0.log" Apr 17 11:26:27.168676 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.168649 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gg86_282d56ec-09a3-4c2e-a098-e8271c1f2147/whereabouts-cni/0.log" Apr 17 11:26:27.524521 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.524490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dwf52_44c25c59-0494-490c-9cea-82d3c5d19215/kube-multus/0.log" Apr 17 11:26:27.653394 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.653296 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n2dw9_00d87433-8bc9-4d18-bab8-e6f889a4b52d/network-metrics-daemon/0.log" Apr 17 11:26:27.675573 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:27.675546 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n2dw9_00d87433-8bc9-4d18-bab8-e6f889a4b52d/kube-rbac-proxy/0.log" Apr 17 11:26:28.741995 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.741970 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-controller/0.log" Apr 17 11:26:28.761284 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.761255 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/0.log" Apr 17 11:26:28.764123 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.764094 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovn-acl-logging/1.log" Apr 17 11:26:28.783333 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.783296 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/kube-rbac-proxy-node/0.log" Apr 17 11:26:28.805179 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.805148 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:26:28.822403 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.822378 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/northd/0.log" Apr 17 11:26:28.843544 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.843520 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/nbdb/0.log" Apr 17 11:26:28.866065 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.866041 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/sbdb/0.log" Apr 17 11:26:28.962324 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:28.962286 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8dz7_fe841d65-b3e0-4db9-8f30-f2b6baf3f1c7/ovnkube-controller/0.log" Apr 17 11:26:30.308861 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:30.308832 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cxlg6_26dbb6ee-4d95-468d-aafe-1bc2e96c41f1/network-check-target-container/0.log" Apr 17 11:26:31.256695 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:31.256663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-n8ccr_02807a83-8fd7-42bf-a119-50f4017a2833/iptables-alerter/0.log" Apr 17 11:26:31.913141 ip-10-0-133-230 kubenswrapper[2579]: I0417 11:26:31.913108 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-c9tz8_df30c3d2-0c10-4a19-94e7-a09f60737213/tuned/0.log"