Apr 24 23:51:12.378659 ip-10-0-140-130 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 23:51:12.378670 ip-10-0-140-130 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 23:51:12.378677 ip-10-0-140-130 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 23:51:12.378939 ip-10-0-140-130 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 23:51:22.420753 ip-10-0-140-130 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 23:51:22.420774 ip-10-0-140-130 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a381c0789784413193a151002ca64d53 --
Apr 24 23:53:34.012881 ip-10-0-140-130 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:34.471092 ip-10-0-140-130 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:34.471092 ip-10-0-140-130 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:34.471092 ip-10-0-140-130 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:34.471092 ip-10-0-140-130 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:34.471092 ip-10-0-140-130 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:34.472977 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.472886    2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:34.476003 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.475988    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:34.476003 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476003    2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:34.476003 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476006    2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476010    2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476013    2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476016    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476020    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476029    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476032    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476035    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476038    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476041    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476045    2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476048    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476051    2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476054    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476056    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476059    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476062    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476065    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476069    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:34.476091 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476072    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476075    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476078    2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476081    2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476084    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476086    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476089    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476091    2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476094    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476097    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476100    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476103    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476105    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476108    2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476111    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476113    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476116    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476118    2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476121    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:34.476596 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476124    2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476127    2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476129    2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476132    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476134    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476137    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476139    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476142    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476144    2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476147    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476149    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476152    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476154    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476157    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476159    2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476163    2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476167    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476169    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476172    2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476175    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:34.477354 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476177    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476180    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476182    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476185    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476188    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476190    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476193    2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476195    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476197    2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476200    2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476203    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476206    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476209    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476213    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476215    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476218    2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476221    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476223    2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476226    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476229    2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:34.477870 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476231    2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476235    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476238    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476241    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476245    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.476248    2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478062    2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478073    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478078    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478082    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478085    2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478088    2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478091    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478094    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478096    2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478099    2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478102    2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478105    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478107    2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478110    2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:34.478346 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478113    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478116    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478118    2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478121    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478123    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478126    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478128    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478131    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478134    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478136    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478139    2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478141    2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478149    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478152    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478155    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478157    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478160    2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478163    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478165    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:34.478839 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478168    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478171    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478174    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478176    2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478179    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478182    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478185    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478187    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478190    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478193    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478195    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478198    2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478200    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478203    2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478205    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478208    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478211    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478213    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478216    2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478218    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:34.479306 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478220    2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478223    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478225    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478228    2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478230    2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478233    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478235    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478238    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478240    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478243    2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478245    2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478248    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478251    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478254    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478257    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478259    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478262    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478266    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478269    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478272    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:34.479825 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478274    2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478276    2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478279    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478281    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478285    2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478288    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478290    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478292    2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478295    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478297    2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478300    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478302    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.478305    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478383    2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478394    2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478404    2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478410    2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478417    2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478422    2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478427    2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478432    2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:34.480319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478435    2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478438    2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478442    2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478445    2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478449    2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478452    2566 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478455    2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478458    2566 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478460    2566 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478463    2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478466    2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478470    2566 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478473    2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478476    2566 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478479    2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478482    2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478487    2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478490    2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478493    2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478497    2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478500    2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478503    2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478506    2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478509    2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478511    2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:34.480850 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478515    2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478518    2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478521    2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478524    2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478527    2566 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478530    2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478535    2566 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478538    2566 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478541    2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478545    2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478549    2566 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478553    2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478556    2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478559    2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478562    2566 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478579    2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478582    2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478585    2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478588    2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478591    2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478594    2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478597
2566 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478601 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478604 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478608 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:34.481456 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478612 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478615 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478618 2566 flags.go:64] FLAG: --help="false" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478621 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478624 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478627 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478630 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478634 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478637 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478640 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:34.482121 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478642 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478645 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478648 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478651 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478654 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478657 2566 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478665 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478668 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478672 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478675 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478678 2566 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478680 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478683 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478687 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:34.482121 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478692 2566 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478695 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478698 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478701 2566 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478704 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478707 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478710 2566 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478713 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478721 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478725 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478729 2566 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478732 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478735 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478738 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478741 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:34.482722 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478744 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478746 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478749 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478757 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478760 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478763 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478766 2566 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478769 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:34.482722 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478774 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478777 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478782 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478785 2566 flags.go:64] FLAG: --port="10250" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478788 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478791 2566 flags.go:64] FLAG: 
--provider-id="aws:///us-east-1a/i-08c9158a96badcea1" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478795 2566 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478797 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478800 2566 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478803 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478806 2566 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478810 2566 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478813 2566 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478816 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478818 2566 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478822 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478825 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478828 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478832 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478835 2566 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478838 2566 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478841 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478844 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478847 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478851 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478853 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:34.483273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478857 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478860 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478863 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478866 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478868 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478871 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478874 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478877 2566 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478882 2566 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478887 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478890 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478893 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478897 2566 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478900 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478902 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478905 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478908 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478911 2566 flags.go:64] FLAG: --v="2" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478916 2566 flags.go:64] FLAG: --version="false" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478920 2566 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478924 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.478928 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479015 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:34.483928 
ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479018 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:34.483928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479021 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479024 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479027 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479030 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479032 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479035 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479037 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479040 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479043 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479046 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479049 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479051 2566 feature_gate.go:328] unrecognized feature 
gate: InsightsOnDemandDataGather Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479054 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479056 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479059 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479062 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479066 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479069 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479072 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479074 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:34.484521 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479077 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479079 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479082 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479084 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:34.485095 ip-10-0-140-130 
kubenswrapper[2566]: W0424 23:53:34.479087 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479089 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479092 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479094 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479097 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479099 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479102 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479106 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479109 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479113 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479116 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479119 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479122 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479124 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479127 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:34.485095 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479130 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479134 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479137 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479139 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479142 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 
23:53:34.479144 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479146 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479149 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479151 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479155 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479158 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479161 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479163 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479166 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479169 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479171 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479175 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479178 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479181 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479184 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:34.485928 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479186 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479189 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479191 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479194 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479196 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479199 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479201 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479204 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479207 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479210 2566 feature_gate.go:328] unrecognized feature gate: 
SignatureStores
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479212 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479215 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479217 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479224 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479227 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479230 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479232 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479235 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479237 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479240 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:34.486853 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479243 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479246 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479249 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479252 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.479254 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.479877 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.487184 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.487204 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487299 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487306 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487312 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487317 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487321 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487326 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487330 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487334 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:34.487721 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487338 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487342 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487346 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487350 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487355 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487359 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487363 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487367 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487372 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487376 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487381 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487385 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487389 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487393 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487397 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487404 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487411 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487416 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487421 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:34.488411 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487426 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487431 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487436 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487440 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487446 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487450 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487454 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487459 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487463 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487467 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487471 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487476 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487481 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487485 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487488 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487493 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487497 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487501 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487505 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487510 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:34.488998 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487514 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487519 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487523 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487527 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487531 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487535 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487539 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487543 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487547 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487552 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487556 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487560 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487587 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487591 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487596 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487600 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487604 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487610 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487615 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487620 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:34.489548 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487625 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487629 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487633 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487637 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487641 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487645 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487650 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487654 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487659 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487663 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487667 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487672 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487676 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487680 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487684 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487688 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487692 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487696 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:34.490302 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487702 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.487711 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487899 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487908 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487913 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487919 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487926 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487930 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487935 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487939 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487944 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487948 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487953 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487958 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487962 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:34.490766 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487966 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487971 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487975 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487979 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487984 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487987 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487992 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.487995 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488000 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488003 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488007 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488011 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488016 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488020 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488025 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488029 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488033 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488037 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488041 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488045 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:34.491167 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488049 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488053 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488058 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488063 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488068 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488072 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488076 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488081 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488085 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488089 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488094 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488099 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488103 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488107 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488111 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488115 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488119 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488123 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488127 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488131 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:34.491690 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488136 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488140 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488144 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488149 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488153 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488157 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488161 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488165 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488170 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488175 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488179 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488183 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488188 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488193 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488197 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488201 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488205 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488209 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488213 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:34.492172 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488217 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488223 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488229 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488234 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488247 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488252 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488256 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488260 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488264 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488268 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488273 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488277 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488281 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:34.488285 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.488292 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:34.492678 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.489109 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 23:53:34.494195 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.494174 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 23:53:34.495100 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.495087 2566 server.go:1019] "Starting client certificate rotation"
Apr 24 23:53:34.495205 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.495189 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:34.495244 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.495232 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:34.521829 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.521806 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:34.526557 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.526539 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:34.544670 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.544651 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 24 23:53:34.550055 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.550034 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:34.550809 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.550796 2566 log.go:25] "Validated CRI v1 image API"
Apr 24 23:53:34.552872 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.552860 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:53:34.558768 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.558749 2566 fs.go:135] Filesystem UUIDs: map[1dccab03-9387-4560-a998-050d3defa59f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8958b586-183a-4e51-aaca-0747ce3c6bb6:/dev/nvme0n1p3]
Apr 24 23:53:34.558819 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.558769 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:53:34.564923 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.564813 2566 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:34.56270596 +0000 UTC m=+0.427198242 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099831 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27c20afad499c220818ad73f622c1a SystemUUID:ec27c20a-fad4-99c2-2081-8ad73f622c1a BootID:a381c078-9784-4131-93a1-51002ca64d53 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7a:f0:a8:16:73 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7a:f0:a8:16:73 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7a:fd:31:f0:20:b5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:53:34.564923 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.564918 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:53:34.565021 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.564993 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:53:34.566145 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.566127 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:53:34.566275 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.566147 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-130.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:34.566315 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.566284 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:34.566315 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.566293 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:34.566315 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.566306 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:34.567052 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.567038 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:34.568282 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.568271 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:34.568391 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.568382 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:34.570684 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.570675 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:34.570715 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.570689 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:34.570715 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.570700 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 23:53:34.570715 ip-10-0-140-130 kubenswrapper[2566]: I0424 
23:53:34.570709 2566 kubelet.go:397] "Adding apiserver pod source" Apr 24 23:53:34.570819 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.570718 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:53:34.571816 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.571805 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:34.571855 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.571824 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:34.574998 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.574982 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:34.576305 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.576292 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:34.578233 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578221 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578241 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578252 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578260 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578266 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578272 2566 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578277 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:34.578280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578282 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:34.578476 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578290 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:34.578476 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578296 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:34.578476 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578305 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:34.578476 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.578313 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:34.579167 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.579158 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:34.579167 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.579167 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:34.582283 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.582256 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-130.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:34.582394 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.582283 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:34.582464 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.582443 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:34.582781 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.582769 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:34.582825 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.582801 2566 server.go:1295] "Started kubelet" Apr 24 23:53:34.582937 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.582887 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:34.582979 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.582893 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:34.583013 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.582985 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:34.584182 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.584159 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:34.584296 ip-10-0-140-130 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 23:53:34.585913 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.585893 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:34.591370 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.591342 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:34.591475 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.591347 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:34.592225 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592203 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:34.592320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592229 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:34.592320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592290 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 23:53:34.592418 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592371 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:34.592418 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592380 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:34.592508 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.591461 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-130.ec2.internal.18a97020e67fef53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-130.ec2.internal,UID:ip-10-0-140-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-130.ec2.internal,},FirstTimestamp:2026-04-24 23:53:34.582779731 +0000 UTC 
m=+0.447272009,LastTimestamp:2026-04-24 23:53:34.582779731 +0000 UTC m=+0.447272009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-130.ec2.internal,}" Apr 24 23:53:34.592644 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592617 2566 factory.go:153] Registering CRI-O factory Apr 24 23:53:34.592644 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592641 2566 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:34.592797 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.592674 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found" Apr 24 23:53:34.592797 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592703 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:34.592797 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592714 2566 factory.go:55] Registering systemd factory Apr 24 23:53:34.592797 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592721 2566 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:34.592797 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592736 2566 factory.go:103] Registering Raw factory Apr 24 23:53:34.592797 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.592748 2566 manager.go:1196] Started watching for new ooms in manager Apr 24 23:53:34.593176 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.593160 2566 manager.go:319] Starting recovery of all containers Apr 24 23:53:34.595102 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.595063 2566 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:53:34.595309 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.595278 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-56cw4" Apr 24 23:53:34.600359 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.600208 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-56cw4" Apr 24 23:53:34.602603 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.602553 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 23:53:34.602603 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.602579 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 23:53:34.604309 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.604295 2566 manager.go:324] Recovery completed Apr 24 23:53:34.608923 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.608912 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:34.611153 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.611139 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:34.611209 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.611167 2566 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:34.611209 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.611182 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:34.611692 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.611677 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:34.611692 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.611691 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:34.611817 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.611709 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:34.615124 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.615110 2566 policy_none.go:49] "None policy: Start" Apr 24 23:53:34.615201 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.615128 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:34.615201 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.615141 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:34.649138 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649126 2566 manager.go:341] "Starting Device Plugin manager" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.649157 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649169 2566 server.go:85] "Starting device plugin registration server" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649391 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649400 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 
24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649498 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649591 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.649600 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.650232 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 23:53:34.672208 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.650297 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-130.ec2.internal\" not found" Apr 24 23:53:34.724499 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.724422 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:34.725742 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.725724 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:53:34.725848 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.725754 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:34.725848 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.725779 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 23:53:34.725848 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.725786 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:34.725848 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.725825 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:34.727953 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.727933 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:34.749521 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.749502 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:34.750533 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.750519 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:34.750625 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.750550 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:34.750625 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.750577 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:34.750625 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.750606 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.759180 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.759162 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.759228 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.759188 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-130.ec2.internal\": node \"ip-10-0-140-130.ec2.internal\" not found" Apr 24 
23:53:34.776941 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.776918 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found" Apr 24 23:53:34.825912 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.825881 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal"] Apr 24 23:53:34.826020 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.825952 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:34.827246 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.827230 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:34.827328 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.827264 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:34.827328 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.827277 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:34.829412 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.829398 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:34.829541 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.829527 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.829606 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.829555 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:34.830145 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.830131 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:34.830215 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.830154 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:34.830215 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.830172 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:34.830215 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.830181 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:34.830308 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.830158 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:34.830308 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.830242 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:34.832425 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.832410 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.832494 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.832441 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:34.833518 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.833504 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:34.833608 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.833528 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:34.833608 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.833539 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:34.859114 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.859098 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-130.ec2.internal\" not found" node="ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.863254 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.863238 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-130.ec2.internal\" not found" node="ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.877685 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.877671 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found" Apr 24 23:53:34.894667 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.894648 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18921f8b148e5eaa8f9152ede9eff20a-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal\" (UID: \"18921f8b148e5eaa8f9152ede9eff20a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.894727 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.894672 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18921f8b148e5eaa8f9152ede9eff20a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal\" (UID: \"18921f8b148e5eaa8f9152ede9eff20a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.894727 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.894694 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4f4adde147201157e73b34681d8e0de6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-130.ec2.internal\" (UID: \"4f4adde147201157e73b34681d8e0de6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.978492 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:34.978420 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found" Apr 24 23:53:34.995817 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.995798 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18921f8b148e5eaa8f9152ede9eff20a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal\" (UID: \"18921f8b148e5eaa8f9152ede9eff20a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" Apr 24 23:53:34.995878 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.995830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/18921f8b148e5eaa8f9152ede9eff20a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal\" (UID: \"18921f8b148e5eaa8f9152ede9eff20a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:34.995878 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.995852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4f4adde147201157e73b34681d8e0de6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-130.ec2.internal\" (UID: \"4f4adde147201157e73b34681d8e0de6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:34.995878 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.995867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18921f8b148e5eaa8f9152ede9eff20a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal\" (UID: \"18921f8b148e5eaa8f9152ede9eff20a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:34.995982 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.995884 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18921f8b148e5eaa8f9152ede9eff20a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal\" (UID: \"18921f8b148e5eaa8f9152ede9eff20a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:34.995982 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:34.995906 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4f4adde147201157e73b34681d8e0de6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-130.ec2.internal\" (UID: \"4f4adde147201157e73b34681d8e0de6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:35.079138 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.079105 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.160662 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.160636 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:35.165244 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.165225 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:35.179927 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.179906 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.280598 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.280489 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.381053 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.381010 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.481737 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.481696 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.495098 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.495079 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:53:35.495222 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.495207 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:35.581810 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.581770 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.591704 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.591684 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:35.602974 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.602947 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:34 +0000 UTC" deadline="2027-09-29 06:26:12.825944084 +0000 UTC"
Apr 24 23:53:35.602974 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.602970 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12534h32m37.222976543s"
Apr 24 23:53:35.606276 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.606259 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:35.629261 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.629235 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mfwq8"
Apr 24 23:53:35.634745 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.634730 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mfwq8"
Apr 24 23:53:35.675601 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:35.675547 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18921f8b148e5eaa8f9152ede9eff20a.slice/crio-680af56adac23f47dca8509d215e6e38d500b6c944f4676bd7e9e93b9e8cdf53 WatchSource:0}: Error finding container 680af56adac23f47dca8509d215e6e38d500b6c944f4676bd7e9e93b9e8cdf53: Status 404 returned error can't find the container with id 680af56adac23f47dca8509d215e6e38d500b6c944f4676bd7e9e93b9e8cdf53
Apr 24 23:53:35.676156 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:35.676126 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f4adde147201157e73b34681d8e0de6.slice/crio-4d011ea0953a2289e5e8f4807809d6c90fab2a738e3fe7688e20d7d6c2bb636c WatchSource:0}: Error finding container 4d011ea0953a2289e5e8f4807809d6c90fab2a738e3fe7688e20d7d6c2bb636c: Status 404 returned error can't find the container with id 4d011ea0953a2289e5e8f4807809d6c90fab2a738e3fe7688e20d7d6c2bb636c
Apr 24 23:53:35.680930 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.680908 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:53:35.682506 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.682490 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.718460 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.718437 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:35.728166 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.728129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" event={"ID":"18921f8b148e5eaa8f9152ede9eff20a","Type":"ContainerStarted","Data":"680af56adac23f47dca8509d215e6e38d500b6c944f4676bd7e9e93b9e8cdf53"}
Apr 24 23:53:35.728966 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.728943 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal" event={"ID":"4f4adde147201157e73b34681d8e0de6","Type":"ContainerStarted","Data":"4d011ea0953a2289e5e8f4807809d6c90fab2a738e3fe7688e20d7d6c2bb636c"}
Apr 24 23:53:35.772324 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:35.772293 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:35.783260 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.783236 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.883800 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.883728 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:35.984262 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:35.984236 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:36.085001 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.084969 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-130.ec2.internal\" not found"
Apr 24 23:53:36.118088 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.118060 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:36.192619 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.192503 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:36.206030 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.205916 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:36.207127 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.207005 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal"
Apr 24 23:53:36.213139 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.213120 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:36.532704 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.532629 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:36.571728 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.571699 2566 apiserver.go:52] "Watching apiserver"
Apr 24 23:53:36.578782 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.578750 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:53:36.580967 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.580937 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wf82j","openshift-ovn-kubernetes/ovnkube-node-w7lm4","openshift-cluster-node-tuning-operator/tuned-brxjn","openshift-image-registry/node-ca-nqwjk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal","openshift-multus/multus-additional-cni-plugins-44g68","openshift-multus/multus-fs6wc","openshift-network-diagnostics/network-check-target-v8dd2","openshift-network-operator/iptables-alerter-cgg9j","kube-system/konnectivity-agent-r4l6n","kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9"]
Apr 24 23:53:36.583302 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.583267 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j"
Apr 24 23:53:36.583713 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.583591 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935"
Apr 24 23:53:36.585872 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.585851 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.588154 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.588132 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.588487 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.588307 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 23:53:36.588487 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.588322 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 23:53:36.588487 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.588354 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 23:53:36.588487 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.588484 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 23:53:36.588756 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.588496 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-94qv5\""
Apr 24 23:53:36.589146 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.589128 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 23:53:36.589360 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.589344 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 23:53:36.590216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.590194 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:36.590336 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.590318 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.590923 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.590907 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xvdhn\""
Apr 24 23:53:36.590923 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.590922 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:36.592829 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.592809 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 23:53:36.592927 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.592870 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 23:53:36.592987 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.592814 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8s2v9\""
Apr 24 23:53:36.593077 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.593062 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 23:53:36.595030 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.595012 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.597144 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597127 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.597274 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597253 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2"
Apr 24 23:53:36.597356 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.597335 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc"
Apr 24 23:53:36.597398 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597259 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hhhm8\""
Apr 24 23:53:36.597480 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597466 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 23:53:36.597520 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597488 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 23:53:36.597695 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597676 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 23:53:36.597767 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597752 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 23:53:36.597867 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.597823 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 23:53:36.599545 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.599515 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 23:53:36.599788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.599660 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cgg9j"
Apr 24 23:53:36.600104 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.600083 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9nv4r\""
Apr 24 23:53:36.601943 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.601922 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:36.602034 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.602003 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r4l6n"
Apr 24 23:53:36.602165 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.602149 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:36.602270 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.602254 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n7rsm\""
Apr 24 23:53:36.602341 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.602329 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 23:53:36.604398 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.604234 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qr6ml"]
Apr 24 23:53:36.604398 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.604296 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 23:53:36.604398 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.604323 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 23:53:36.604398 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.604354 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dcgnw\""
Apr 24 23:53:36.604398 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.604379 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9"
Apr 24 23:53:36.606329 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606309 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-run-netns\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.606429 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606344 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkjx\" (UniqueName: \"kubernetes.io/projected/b4228f03-25e8-4a96-b72d-5f9fa76ee207-kube-api-access-2tkjx\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.606429 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606376 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysctl-d\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.606429 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606408 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovnkube-config\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.606595 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606446 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-env-overrides\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.606595 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-sys\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.606595 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-host\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.606595 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-kubelet\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.606788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606601 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j"
Apr 24 23:53:36.606788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606643 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-serviceca\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.606788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606706 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 23:53:36.606788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606714 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-os-release\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.606788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606757 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cni-binary-copy\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.607002 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606950 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 23:53:36.607002 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.606958 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7zkw6\""
Apr 24 23:53:36.607114 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607074 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml"
Apr 24 23:53:36.607216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607117 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 23:53:36.607216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607126 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.607216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607167 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-k8s-cni-cncf-io\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.607352 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.607166 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c"
Apr 24 23:53:36.607352 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607231 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-netns\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.607352 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607253 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-log-socket\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.607352 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607304 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-run-ovn-kubernetes\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.607352 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-cni-bin\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607378 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/349e5253-376d-444a-b099-86d3fb1b6b37-etc-tuned\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607402 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607430 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-cni-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607479 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607508 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5zh2\" (UniqueName: \"kubernetes.io/projected/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-kube-api-access-d5zh2\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607540 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-tuning-conf-dir\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.607598 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsrhs\" (UniqueName: \"kubernetes.io/projected/93b15e14-7d7c-4b19-bb25-2e48ae26af80-kube-api-access-gsrhs\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607649 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-hostroot\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607673 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-kubelet\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607693 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-host\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607721 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-os-release\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607812 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-lib-modules\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607838 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-conf-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607861 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-modprobe-d\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607900 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysconfig\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.607936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607929 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84tfq\" (UniqueName: \"kubernetes.io/projected/e101d25b-89b6-4522-8e39-35b94ce4d935-kube-api-access-84tfq\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607946 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-var-lib-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607975 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-cni-netd\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.607997 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovn-node-metrics-cert\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608025 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-systemd\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11e92e50-ca0e-4133-9e99-69897f47de51-multus-daemon-config\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608093 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-etc-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608116 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhql7\" (UniqueName: \"kubernetes.io/projected/11e92e50-ca0e-4133-9e99-69897f47de51-kube-api-access-jhql7\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\")
" pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608134 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-kubernetes\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608171 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cnibin\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608201 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11e92e50-ca0e-4133-9e99-69897f47de51-cni-binary-copy\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608227 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysctl-conf\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608252 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-cni-multus\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-slash\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608299 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrxm\" (UniqueName: \"kubernetes.io/projected/349e5253-376d-444a-b099-86d3fb1b6b37-kube-api-access-dlrxm\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.608320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-multus-certs\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608344 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-system-cni-dir\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608368 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-system-cni-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608390 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-systemd-units\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608412 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovnkube-script-lib\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608445 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-var-lib-kubelet\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608488 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/349e5253-376d-444a-b099-86d3fb1b6b37-tmp\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 
23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608528 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-cnibin\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608589 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-systemd\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-run\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-socket-dir-parent\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608731 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-cni-bin\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " 
pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608763 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-etc-kubernetes\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608784 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-ovn\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.609035 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.608876 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-node-log\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.635312 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.635278 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:35 +0000 UTC" deadline="2027-11-16 19:30:39.980563902 +0000 UTC" Apr 24 23:53:36.635312 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.635311 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13699h37m3.345256004s" Apr 24 23:53:36.693796 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.693772 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:36.709822 ip-10-0-140-130 
kubenswrapper[2566]: I0424 23:53:36.709785 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-kubernetes\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709829 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cnibin\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709849 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11e92e50-ca0e-4133-9e99-69897f47de51-cni-binary-copy\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709880 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-registration-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709894 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-kubernetes\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " 
pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709915 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysctl-conf\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709942 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-cni-multus\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709953 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cnibin\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.709972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709964 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-device-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.709987 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-slash\") pod 
\"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710002 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrxm\" (UniqueName: \"kubernetes.io/projected/349e5253-376d-444a-b099-86d3fb1b6b37-kube-api-access-dlrxm\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710009 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-cni-multus\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710021 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-multus-certs\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710046 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e4741ff1-3c93-48d0-8650-7f35e738b042-konnectivity-ca\") pod \"konnectivity-agent-r4l6n\" (UID: \"e4741ff1-3c93-48d0-8650-7f35e738b042\") " pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710059 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysctl-conf\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710075 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-slash\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-multus-certs\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710180 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710221 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-system-cni-dir\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710251 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-system-cni-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5384178-0a8f-4c23-96ba-bcbe045f676c-dbus\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710263 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-system-cni-dir\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710309 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-systemd-units\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovnkube-script-lib\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.710346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710338 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-system-cni-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710347 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-systemd-units\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-var-lib-kubelet\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/349e5253-376d-444a-b099-86d3fb1b6b37-tmp\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-var-lib-kubelet\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710440 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-cnibin\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710468 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-socket-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-systemd\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710519 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-run\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-socket-dir-parent\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710592 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-cnibin\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710635 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-systemd\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710644 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-run\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710593 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-cni-bin\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-etc-kubernetes\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710690 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-socket-dir-parent\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710518 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11e92e50-ca0e-4133-9e99-69897f47de51-cni-binary-copy\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-ovn\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711015 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-node-log\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710750 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-cni-bin\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710749 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710750 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-etc-kubernetes\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-ovn\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710797 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-node-log\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-run-netns\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710754 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-run-netns\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710840 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkjx\" (UniqueName: \"kubernetes.io/projected/b4228f03-25e8-4a96-b72d-5f9fa76ee207-kube-api-access-2tkjx\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysctl-d\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710891 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-sys-fs\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-iptables-alerter-script\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovnkube-config\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710965 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovnkube-script-lib\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710986 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysctl-d\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.710967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-env-overrides\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-sys\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711201 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-host\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.711874 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711208 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-sys\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711238 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-kubelet\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711265 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-serviceca\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711297 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-var-lib-kubelet\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711266 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-host\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711313 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-os-release\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711343 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-env-overrides\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.711376 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711368 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cni-binary-copy\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711387 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-os-release\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.711465 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:37.211426819 +0000 UTC m=+3.075919102 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711484 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-k8s-cni-cncf-io\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-netns\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711580 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-log-socket\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.712627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-run-ovn-kubernetes\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711628 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-cni-bin\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711645 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovnkube-config\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/349e5253-376d-444a-b099-86d3fb1b6b37-etc-tuned\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-serviceca\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711701 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-netns\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711708 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711714 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-run-ovn-kubernetes\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711734 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-cni-bin\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") "
pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711736 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-cni-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711762 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-log-socket\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-host-run-k8s-cni-cncf-io\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711779 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711810 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-run-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-cni-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.713281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711823 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5zh2\" (UniqueName: \"kubernetes.io/projected/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-kube-api-access-d5zh2\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-tuning-conf-dir\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711877 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsrhs\" (UniqueName: \"kubernetes.io/projected/93b15e14-7d7c-4b19-bb25-2e48ae26af80-kube-api-access-gsrhs\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711900 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-hostroot\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711928 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e4741ff1-3c93-48d0-8650-7f35e738b042-agent-certs\") pod \"konnectivity-agent-r4l6n\" (UID: \"e4741ff1-3c93-48d0-8650-7f35e738b042\") " pod="kube-system/konnectivity-agent-r4l6n"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711956 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlm28\" (UniqueName: \"kubernetes.io/projected/a7885794-c248-4c2f-8880-b5b7573c5cfc-kube-api-access-wlm28\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711971 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cni-binary-copy\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.711980 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5384178-0a8f-4c23-96ba-bcbe045f676c-kubelet-config\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712004 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-kubelet\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712031 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-hostroot\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712029 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-host\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712074 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-os-release\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712099 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712127 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-host\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712148 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-host-slash\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712153 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93b15e14-7d7c-4b19-bb25-2e48ae26af80-tuning-conf-dir\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712166 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-kubelet\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.714000 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712178 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kube-api-access-5ftb2\" (UniqueName: \"kubernetes.io/projected/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-kube-api-access-5ftb2\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-lib-modules\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-conf-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712238 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712266 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-os-release\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712280 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11e92e50-ca0e-4133-9e99-69897f47de51-multus-conf-dir\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712310 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-modprobe-d\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712387 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysconfig\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712399 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-lib-modules\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712407 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-modprobe-d\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84tfq\" (UniqueName: \"kubernetes.io/projected/e101d25b-89b6-4522-8e39-35b94ce4d935-kube-api-access-84tfq\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712445 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-sysconfig\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712446 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-var-lib-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-cni-netd\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712482 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-var-lib-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712511 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovn-node-metrics-cert\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712536 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-systemd\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.714887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712549 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-host-cni-netd\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712577 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11e92e50-ca0e-4133-9e99-69897f47de51-multus-daemon-config\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712615 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-etc-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712634 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/349e5253-376d-444a-b099-86d3fb1b6b37-etc-systemd\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712641 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhql7\" (UniqueName: \"kubernetes.io/projected/11e92e50-ca0e-4133-9e99-69897f47de51-kube-api-access-jhql7\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4228f03-25e8-4a96-b72d-5f9fa76ee207-etc-openvswitch\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.712670 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-etc-selinux\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9"
Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.713012 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93b15e14-7d7c-4b19-bb25-2e48ae26af80-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68"
Apr 24 23:53:36.715708
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.713632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11e92e50-ca0e-4133-9e99-69897f47de51-multus-daemon-config\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.714378 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/349e5253-376d-444a-b099-86d3fb1b6b37-tmp\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.714508 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/349e5253-376d-444a-b099-86d3fb1b6b37-etc-tuned\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.715708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.714999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4228f03-25e8-4a96-b72d-5f9fa76ee207-ovn-node-metrics-cert\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.723641 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.723526 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrxm\" (UniqueName: \"kubernetes.io/projected/349e5253-376d-444a-b099-86d3fb1b6b37-kube-api-access-dlrxm\") pod \"tuned-brxjn\" (UID: \"349e5253-376d-444a-b099-86d3fb1b6b37\") " pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.724235 ip-10-0-140-130 kubenswrapper[2566]: 
I0424 23:53:36.724168 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkjx\" (UniqueName: \"kubernetes.io/projected/b4228f03-25e8-4a96-b72d-5f9fa76ee207-kube-api-access-2tkjx\") pod \"ovnkube-node-w7lm4\" (UID: \"b4228f03-25e8-4a96-b72d-5f9fa76ee207\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.725222 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.725176 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsrhs\" (UniqueName: \"kubernetes.io/projected/93b15e14-7d7c-4b19-bb25-2e48ae26af80-kube-api-access-gsrhs\") pod \"multus-additional-cni-plugins-44g68\" (UID: \"93b15e14-7d7c-4b19-bb25-2e48ae26af80\") " pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.725310 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.725267 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84tfq\" (UniqueName: \"kubernetes.io/projected/e101d25b-89b6-4522-8e39-35b94ce4d935-kube-api-access-84tfq\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:36.725510 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.725490 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhql7\" (UniqueName: \"kubernetes.io/projected/11e92e50-ca0e-4133-9e99-69897f47de51-kube-api-access-jhql7\") pod \"multus-fs6wc\" (UID: \"11e92e50-ca0e-4133-9e99-69897f47de51\") " pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.726186 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.726165 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5zh2\" (UniqueName: \"kubernetes.io/projected/cdb63bec-b61b-4953-b2fc-7f06ee7063ac-kube-api-access-d5zh2\") pod \"node-ca-nqwjk\" (UID: \"cdb63bec-b61b-4953-b2fc-7f06ee7063ac\") " pod="openshift-image-registry/node-ca-nqwjk" Apr 24 
23:53:36.813529 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813455 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-socket-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.813529 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813499 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-sys-fs\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.813529 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813522 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-iptables-alerter-script\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813581 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813609 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-sys-fs\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: 
\"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813616 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e4741ff1-3c93-48d0-8650-7f35e738b042-agent-certs\") pod \"konnectivity-agent-r4l6n\" (UID: \"e4741ff1-3c93-48d0-8650-7f35e738b042\") " pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813674 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlm28\" (UniqueName: \"kubernetes.io/projected/a7885794-c248-4c2f-8880-b5b7573c5cfc-kube-api-access-wlm28\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-socket-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5384178-0a8f-4c23-96ba-bcbe045f676c-kubelet-config\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813735 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813762 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-host-slash\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.813767 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:36.813791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813785 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ftb2\" (UniqueName: \"kubernetes.io/projected/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-kube-api-access-5ftb2\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813799 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5384178-0a8f-4c23-96ba-bcbe045f676c-kubelet-config\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-etc-selinux\") pod \"aws-ebs-csi-driver-node-6r7j9\" 
(UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.813847 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:37.313825751 +0000 UTC m=+3.178318033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813860 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-host-slash\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813905 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-etc-selinux\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.813990 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-registration-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814022 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-device-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814063 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e4741ff1-3c93-48d0-8650-7f35e738b042-konnectivity-ca\") pod \"konnectivity-agent-r4l6n\" (UID: \"e4741ff1-3c93-48d0-8650-7f35e738b042\") " pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814072 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-registration-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814085 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-iptables-alerter-script\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814113 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a7885794-c248-4c2f-8880-b5b7573c5cfc-device-dir\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.814185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5384178-0a8f-4c23-96ba-bcbe045f676c-dbus\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:36.814795 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814311 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5384178-0a8f-4c23-96ba-bcbe045f676c-dbus\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:36.814795 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.814587 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e4741ff1-3c93-48d0-8650-7f35e738b042-konnectivity-ca\") pod \"konnectivity-agent-r4l6n\" (UID: \"e4741ff1-3c93-48d0-8650-7f35e738b042\") " pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:36.815970 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.815949 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e4741ff1-3c93-48d0-8650-7f35e738b042-agent-certs\") pod \"konnectivity-agent-r4l6n\" (UID: \"e4741ff1-3c93-48d0-8650-7f35e738b042\") " pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:36.822936 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.822916 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:36.823039 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.822941 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:36.823039 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.822955 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fdd9x for pod openshift-network-diagnostics/network-check-target-v8dd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:36.823039 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:36.823030 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x podName:58df42ab-cad3-4814-9298-b1098600ccdc nodeName:}" failed. No retries permitted until 2026-04-24 23:53:37.323011682 +0000 UTC m=+3.187503970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fdd9x" (UniqueName: "kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x") pod "network-check-target-v8dd2" (UID: "58df42ab-cad3-4814-9298-b1098600ccdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:36.824704 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.824683 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ftb2\" (UniqueName: \"kubernetes.io/projected/2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0-kube-api-access-5ftb2\") pod \"iptables-alerter-cgg9j\" (UID: \"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0\") " pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.824846 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.824824 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlm28\" (UniqueName: \"kubernetes.io/projected/a7885794-c248-4c2f-8880-b5b7573c5cfc-kube-api-access-wlm28\") pod \"aws-ebs-csi-driver-node-6r7j9\" (UID: \"a7885794-c248-4c2f-8880-b5b7573c5cfc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.897290 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.897261 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:36.905167 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.905144 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brxjn" Apr 24 23:53:36.914733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.914709 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nqwjk" Apr 24 23:53:36.919328 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.919309 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-44g68" Apr 24 23:53:36.925935 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.925916 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fs6wc" Apr 24 23:53:36.932434 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.932412 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cgg9j" Apr 24 23:53:36.939973 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.939957 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:36.945548 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.945527 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" Apr 24 23:53:36.952382 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.952362 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tdkpx"] Apr 24 23:53:36.956624 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.956609 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:36.959374 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.959353 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 23:53:36.959467 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.959353 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cv7z5\"" Apr 24 23:53:36.959837 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:36.959820 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 23:53:37.015529 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.015499 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cj9t\" (UniqueName: \"kubernetes.io/projected/07e0eebb-8365-490f-b2b2-16f26075fac7-kube-api-access-4cj9t\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.015699 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.015586 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/07e0eebb-8365-490f-b2b2-16f26075fac7-hosts-file\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.015699 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.015620 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07e0eebb-8365-490f-b2b2-16f26075fac7-tmp-dir\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.116375 ip-10-0-140-130 kubenswrapper[2566]: I0424 
23:53:37.116299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cj9t\" (UniqueName: \"kubernetes.io/projected/07e0eebb-8365-490f-b2b2-16f26075fac7-kube-api-access-4cj9t\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.116522 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.116381 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/07e0eebb-8365-490f-b2b2-16f26075fac7-hosts-file\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.116522 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.116404 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07e0eebb-8365-490f-b2b2-16f26075fac7-tmp-dir\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.116637 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.116527 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/07e0eebb-8365-490f-b2b2-16f26075fac7-hosts-file\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.116791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.116764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07e0eebb-8365-490f-b2b2-16f26075fac7-tmp-dir\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.124732 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.124710 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4cj9t\" (UniqueName: \"kubernetes.io/projected/07e0eebb-8365-490f-b2b2-16f26075fac7-kube-api-access-4cj9t\") pod \"node-resolver-tdkpx\" (UID: \"07e0eebb-8365-490f-b2b2-16f26075fac7\") " pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.216873 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.216840 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:37.217034 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.216996 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:37.217083 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.217064 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.217048126 +0000 UTC m=+4.081540391 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:37.265820 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.265788 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tdkpx" Apr 24 23:53:37.295987 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.295764 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb63bec_b61b_4953_b2fc_7f06ee7063ac.slice/crio-3127fc14a31a8098bfaaf94c3ff51358774978320fce1267d39f73c787ab0276 WatchSource:0}: Error finding container 3127fc14a31a8098bfaaf94c3ff51358774978320fce1267d39f73c787ab0276: Status 404 returned error can't find the container with id 3127fc14a31a8098bfaaf94c3ff51358774978320fce1267d39f73c787ab0276 Apr 24 23:53:37.298679 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.298654 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e92e50_ca0e_4133_9e99_69897f47de51.slice/crio-dadb4785e3a7b0b7cf04758992b39a1e567a6d65cceb8192087f32249b4b4b99 WatchSource:0}: Error finding container dadb4785e3a7b0b7cf04758992b39a1e567a6d65cceb8192087f32249b4b4b99: Status 404 returned error can't find the container with id dadb4785e3a7b0b7cf04758992b39a1e567a6d65cceb8192087f32249b4b4b99 Apr 24 23:53:37.300451 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.300425 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e0eebb_8365_490f_b2b2_16f26075fac7.slice/crio-993068427fd1005cf0baa2abe2c5838139dbfc8896ab55e08f995bcadb7fb009 WatchSource:0}: Error finding container 993068427fd1005cf0baa2abe2c5838139dbfc8896ab55e08f995bcadb7fb009: Status 404 returned error can't find the container with id 993068427fd1005cf0baa2abe2c5838139dbfc8896ab55e08f995bcadb7fb009 Apr 24 23:53:37.302408 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.302361 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod349e5253_376d_444a_b099_86d3fb1b6b37.slice/crio-4c2d740e13d42ffd4c56797be08656d44e0427bd1d8a9018c28e909e5cb64a2e WatchSource:0}: Error finding container 4c2d740e13d42ffd4c56797be08656d44e0427bd1d8a9018c28e909e5cb64a2e: Status 404 returned error can't find the container with id 4c2d740e13d42ffd4c56797be08656d44e0427bd1d8a9018c28e909e5cb64a2e Apr 24 23:53:37.303347 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.303325 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4741ff1_3c93_48d0_8650_7f35e738b042.slice/crio-9996a41a16d441dce42f6307443502f06f78c8d87666b42cb41c44701cb44386 WatchSource:0}: Error finding container 9996a41a16d441dce42f6307443502f06f78c8d87666b42cb41c44701cb44386: Status 404 returned error can't find the container with id 9996a41a16d441dce42f6307443502f06f78c8d87666b42cb41c44701cb44386 Apr 24 23:53:37.318055 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.318037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:37.318161 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.318150 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:37.318207 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.318192 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:38.318179119 +0000 UTC m=+4.182671390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:37.324089 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.324066 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4228f03_25e8_4a96_b72d_5f9fa76ee207.slice/crio-a168e494cdfca7fe634fb65a8b6fb730198f06e93f1bafd2be26772f25e0bd28 WatchSource:0}: Error finding container a168e494cdfca7fe634fb65a8b6fb730198f06e93f1bafd2be26772f25e0bd28: Status 404 returned error can't find the container with id a168e494cdfca7fe634fb65a8b6fb730198f06e93f1bafd2be26772f25e0bd28 Apr 24 23:53:37.325000 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.324977 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93b15e14_7d7c_4b19_bb25_2e48ae26af80.slice/crio-4d4d8ec5d3ddf332abb5a047ee7720a7de2dcfee7679f8427cf8ed77347feb7a WatchSource:0}: Error finding container 4d4d8ec5d3ddf332abb5a047ee7720a7de2dcfee7679f8427cf8ed77347feb7a: Status 404 returned error can't find the container with id 4d4d8ec5d3ddf332abb5a047ee7720a7de2dcfee7679f8427cf8ed77347feb7a Apr 24 23:53:37.325676 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.325646 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7885794_c248_4c2f_8880_b5b7573c5cfc.slice/crio-7b506fb197906fc7c3fd7873e70b232a1fba2bf053b0da088bf42f9f4790b594 WatchSource:0}: Error finding container 7b506fb197906fc7c3fd7873e70b232a1fba2bf053b0da088bf42f9f4790b594: Status 404 returned error can't 
find the container with id 7b506fb197906fc7c3fd7873e70b232a1fba2bf053b0da088bf42f9f4790b594 Apr 24 23:53:37.326594 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:53:37.326556 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2adf0675_4eb9_4ad4_8d83_22c4f20fa0a0.slice/crio-f4ee89f4f787ec56af38b3ef53f3e13e434b6ff74f0eee93a360c4149516e292 WatchSource:0}: Error finding container f4ee89f4f787ec56af38b3ef53f3e13e434b6ff74f0eee93a360c4149516e292: Status 404 returned error can't find the container with id f4ee89f4f787ec56af38b3ef53f3e13e434b6ff74f0eee93a360c4149516e292 Apr 24 23:53:37.418773 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.418749 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:37.418866 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.418854 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:37.418920 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.418869 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:37.418920 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.418878 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fdd9x for pod openshift-network-diagnostics/network-check-target-v8dd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 
24 23:53:37.418920 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.418917 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x podName:58df42ab-cad3-4814-9298-b1098600ccdc nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.418903494 +0000 UTC m=+4.283395759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fdd9x" (UniqueName: "kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x") pod "network-check-target-v8dd2" (UID: "58df42ab-cad3-4814-9298-b1098600ccdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:37.636670 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.636471 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:35 +0000 UTC" deadline="2028-01-16 21:07:10.249507832 +0000 UTC" Apr 24 23:53:37.636670 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.636508 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15165h13m32.613003926s" Apr 24 23:53:37.726389 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.726355 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:37.726548 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.726492 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:37.727090 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.726930 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:37.727090 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:37.727027 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:37.743351 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.743313 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brxjn" event={"ID":"349e5253-376d-444a-b099-86d3fb1b6b37","Type":"ContainerStarted","Data":"4c2d740e13d42ffd4c56797be08656d44e0427bd1d8a9018c28e909e5cb64a2e"} Apr 24 23:53:37.747447 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.747397 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nqwjk" event={"ID":"cdb63bec-b61b-4953-b2fc-7f06ee7063ac","Type":"ContainerStarted","Data":"3127fc14a31a8098bfaaf94c3ff51358774978320fce1267d39f73c787ab0276"} Apr 24 23:53:37.754025 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.753353 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal" event={"ID":"4f4adde147201157e73b34681d8e0de6","Type":"ContainerStarted","Data":"f765ff873c6ac1b40b8224950ac5a14289590aff8b94b30ec9b437dc162046b5"} Apr 24 23:53:37.761674 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.761621 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-cgg9j" event={"ID":"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0","Type":"ContainerStarted","Data":"f4ee89f4f787ec56af38b3ef53f3e13e434b6ff74f0eee93a360c4149516e292"} Apr 24 23:53:37.764879 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.764818 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerStarted","Data":"4d4d8ec5d3ddf332abb5a047ee7720a7de2dcfee7679f8427cf8ed77347feb7a"} Apr 24 23:53:37.766794 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.766721 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r4l6n" event={"ID":"e4741ff1-3c93-48d0-8650-7f35e738b042","Type":"ContainerStarted","Data":"9996a41a16d441dce42f6307443502f06f78c8d87666b42cb41c44701cb44386"} Apr 24 23:53:37.771130 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.771070 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" event={"ID":"a7885794-c248-4c2f-8880-b5b7573c5cfc","Type":"ContainerStarted","Data":"7b506fb197906fc7c3fd7873e70b232a1fba2bf053b0da088bf42f9f4790b594"} Apr 24 23:53:37.776803 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.776760 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"a168e494cdfca7fe634fb65a8b6fb730198f06e93f1bafd2be26772f25e0bd28"} Apr 24 23:53:37.781693 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.781647 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tdkpx" event={"ID":"07e0eebb-8365-490f-b2b2-16f26075fac7","Type":"ContainerStarted","Data":"993068427fd1005cf0baa2abe2c5838139dbfc8896ab55e08f995bcadb7fb009"} Apr 24 23:53:37.790721 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:37.790680 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fs6wc" event={"ID":"11e92e50-ca0e-4133-9e99-69897f47de51","Type":"ContainerStarted","Data":"dadb4785e3a7b0b7cf04758992b39a1e567a6d65cceb8192087f32249b4b4b99"} Apr 24 23:53:38.229254 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.228554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:38.229254 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.228857 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:38.229254 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.228924 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:40.228904036 +0000 UTC m=+6.093396324 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:38.330299 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.329718 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:38.330299 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.329875 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:38.330299 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.329936 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:40.329917644 +0000 UTC m=+6.194409916 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:38.430400 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.430360 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:38.430632 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.430596 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:38.430632 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.430614 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:38.430632 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.430627 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fdd9x for pod openshift-network-diagnostics/network-check-target-v8dd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:38.430813 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.430686 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x podName:58df42ab-cad3-4814-9298-b1098600ccdc nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:40.430665592 +0000 UTC m=+6.295157861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fdd9x" (UniqueName: "kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x") pod "network-check-target-v8dd2" (UID: "58df42ab-cad3-4814-9298-b1098600ccdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:38.729493 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.728990 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:38.729493 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:38.729111 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:38.805522 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.805435 2566 generic.go:358] "Generic (PLEG): container finished" podID="18921f8b148e5eaa8f9152ede9eff20a" containerID="03cd675bf49370fa01c4aef3ba244995c141d6dc876b397dbc8e1741a49125d0" exitCode=0 Apr 24 23:53:38.808216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.806485 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" event={"ID":"18921f8b148e5eaa8f9152ede9eff20a","Type":"ContainerDied","Data":"03cd675bf49370fa01c4aef3ba244995c141d6dc876b397dbc8e1741a49125d0"} Apr 24 23:53:38.823590 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:38.823270 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-130.ec2.internal" podStartSLOduration=2.82324958 podStartE2EDuration="2.82324958s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:37.772997661 +0000 UTC m=+3.637489952" watchObservedRunningTime="2026-04-24 23:53:38.82324958 +0000 UTC m=+4.687741869" Apr 24 23:53:39.725972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:39.725943 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:39.726173 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:39.726086 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:39.726290 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:39.726271 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:39.726398 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:39.726363 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:39.825375 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:39.824703 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" event={"ID":"18921f8b148e5eaa8f9152ede9eff20a","Type":"ContainerStarted","Data":"bfe37bf239f72888292342df118164dc2963142c15ab949b29676a2a5fea71f0"} Apr 24 23:53:39.838462 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:39.838409 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-130.ec2.internal" podStartSLOduration=3.838392573 podStartE2EDuration="3.838392573s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:39.837973651 +0000 UTC m=+5.702465997" watchObservedRunningTime="2026-04-24 23:53:39.838392573 +0000 UTC m=+5.702884862" Apr 24 23:53:40.248392 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:40.248358 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:40.248623 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.248539 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:40.248623 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.248626 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.248604244 +0000 UTC m=+10.113096523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:40.348802 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:40.348769 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:40.348992 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.348965 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:40.349065 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.349044 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret 
podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.349023083 +0000 UTC m=+10.213515358 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:40.449369 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:40.449336 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:40.449546 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.449524 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:40.449635 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.449556 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:40.449635 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.449587 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fdd9x for pod openshift-network-diagnostics/network-check-target-v8dd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:40.449737 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.449649 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x podName:58df42ab-cad3-4814-9298-b1098600ccdc nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.44963002 +0000 UTC m=+10.314122306 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fdd9x" (UniqueName: "kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x") pod "network-check-target-v8dd2" (UID: "58df42ab-cad3-4814-9298-b1098600ccdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:40.726462 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:40.726345 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:40.726635 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:40.726486 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:41.726803 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:41.726752 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:41.726803 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:41.726802 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:41.727306 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:41.726976 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:41.727306 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:41.727043 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:42.726920 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:42.726885 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:42.727393 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:42.727029 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:43.727471 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:43.726997 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:43.727471 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:43.726997 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:43.727471 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:43.727136 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:43.727471 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:43.727276 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:44.282997 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:44.282958 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:44.283172 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.283104 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:44.283226 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.283185 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:52.283164542 +0000 UTC m=+18.147656808 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:44.383925 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:44.383819 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:44.384091 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.384012 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:44.384091 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.384083 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:52.384064628 +0000 UTC m=+18.248556897 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:44.484298 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:44.484227 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:44.484463 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.484383 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:44.484463 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.484402 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:44.484463 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.484413 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fdd9x for pod openshift-network-diagnostics/network-check-target-v8dd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:44.484785 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.484467 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x podName:58df42ab-cad3-4814-9298-b1098600ccdc nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:52.484449164 +0000 UTC m=+18.348941443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fdd9x" (UniqueName: "kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x") pod "network-check-target-v8dd2" (UID: "58df42ab-cad3-4814-9298-b1098600ccdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:44.728159 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:44.727675 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:44.728159 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:44.727794 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:45.726466 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:45.726430 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:45.726663 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:45.726431 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:45.726663 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:45.726557 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:45.726808 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:45.726665 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:46.726957 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:46.726870 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:46.727397 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:46.727018 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:47.726962 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:47.726926 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:47.727405 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:47.726926 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:47.727405 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:47.727079 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:47.727405 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:47.727136 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:48.726810 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:48.726782 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:48.726972 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:48.726899 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:49.726252 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:49.726222 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:49.726252 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:49.726251 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:49.726716 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:49.726350 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:49.726716 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:49.726483 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:50.729111 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:50.729072 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:50.729523 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:50.729178 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:51.726511 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:51.726474 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:51.726689 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:51.726471 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:51.726689 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:51.726635 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:51.726815 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:51.726770 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:52.345305 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:52.345263 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:52.345734 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.345396 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:52.345734 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.345458 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.345441831 +0000 UTC m=+34.209934101 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:52.445763 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:52.445730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:52.445923 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.445859 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:52.445923 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.445911 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.445895525 +0000 UTC m=+34.310387796 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:52.546785 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:52.546750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:52.546942 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.546916 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:52.546942 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.546940 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:52.547040 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.546954 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fdd9x for pod openshift-network-diagnostics/network-check-target-v8dd2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:52.547040 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.547012 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x podName:58df42ab-cad3-4814-9298-b1098600ccdc nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:08.546998348 +0000 UTC m=+34.411490620 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fdd9x" (UniqueName: "kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x") pod "network-check-target-v8dd2" (UID: "58df42ab-cad3-4814-9298-b1098600ccdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:52.726999 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:52.726968 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:52.727193 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:52.727097 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:53.726417 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:53.726387 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:53.726883 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:53.726387 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:53.726883 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:53.726501 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:53.726883 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:53.726555 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:54.727491 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.727214 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:54.728018 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:54.727560 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:54.850601 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.850536 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" event={"ID":"a7885794-c248-4c2f-8880-b5b7573c5cfc","Type":"ContainerStarted","Data":"11d984e1f3a4d64dde2f9e0db89b51cfba92ccdb0b97a67ed2d088ae3827e1c2"} Apr 24 23:53:54.853065 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853042 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 24 23:53:54.853425 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853390 2566 generic.go:358] "Generic (PLEG): container finished" podID="b4228f03-25e8-4a96-b72d-5f9fa76ee207" containerID="929b537858f4739584cf05b5b6a3f49cf6b7125fa80f6bdddaf6fa2b6e4dad88" exitCode=1 Apr 24 23:53:54.853519 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"34317df953f0d1832c7cb255cd570fdafc5fed161c78102eef76a4d78fe1db1d"} Apr 24 23:53:54.853519 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853499 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"215dff5ec45c6cfad5766b2c3d76228f9afa0276677d9beab903944244e6e10f"} Apr 24 23:53:54.853519 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853512 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" 
event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"8000d3e04f2e32e6794f60e576e12a1eb6c2b44a7b923a743f3f2b57a7063866"} Apr 24 23:53:54.853671 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853527 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerDied","Data":"929b537858f4739584cf05b5b6a3f49cf6b7125fa80f6bdddaf6fa2b6e4dad88"} Apr 24 23:53:54.853671 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.853543 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"65cbed014ecba4964d5a44ea99f935a19cc48ffdfbdf5aaad56e13e0013450ac"} Apr 24 23:53:54.854794 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.854767 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tdkpx" event={"ID":"07e0eebb-8365-490f-b2b2-16f26075fac7","Type":"ContainerStarted","Data":"bdff210975ac9a35a806f5514ce57744f25a4d82bfaaf865a5744860fc245494"} Apr 24 23:53:54.856216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.856195 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fs6wc" event={"ID":"11e92e50-ca0e-4133-9e99-69897f47de51","Type":"ContainerStarted","Data":"25ec3baf48f0e5bbfb3b309c32f49df6f7d595f8994dcf0b7a70a7221fae43b2"} Apr 24 23:53:54.857482 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.857463 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brxjn" event={"ID":"349e5253-376d-444a-b099-86d3fb1b6b37","Type":"ContainerStarted","Data":"827a136997ca6cf43840da33519fbb5716b8e619a74581efd26090a1853522a4"} Apr 24 23:53:54.858909 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.858874 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-nqwjk" event={"ID":"cdb63bec-b61b-4953-b2fc-7f06ee7063ac","Type":"ContainerStarted","Data":"f1c0152e5304708ded1f46a8c5626b4ebd787507b914014174dd3db7f32bf557"} Apr 24 23:53:54.860355 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.860332 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b15e14-7d7c-4b19-bb25-2e48ae26af80" containerID="dc7977e8035976b1cf6bdd2d83ca892b97dd4c8ff9e035d4fc97ab02129112ba" exitCode=0 Apr 24 23:53:54.860447 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.860359 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerDied","Data":"dc7977e8035976b1cf6bdd2d83ca892b97dd4c8ff9e035d4fc97ab02129112ba"} Apr 24 23:53:54.861875 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.861799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r4l6n" event={"ID":"e4741ff1-3c93-48d0-8650-7f35e738b042","Type":"ContainerStarted","Data":"40b2dadeeecf2a14c1910bdf3651a17e14957bef087cf72bf08d6daf68773f03"} Apr 24 23:53:54.868151 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.868107 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tdkpx" podStartSLOduration=2.252053199 podStartE2EDuration="18.868092916s" podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.30190551 +0000 UTC m=+3.166397778" lastFinishedPulling="2026-04-24 23:53:53.917945223 +0000 UTC m=+19.782437495" observedRunningTime="2026-04-24 23:53:54.867754977 +0000 UTC m=+20.732247266" watchObservedRunningTime="2026-04-24 23:53:54.868092916 +0000 UTC m=+20.732585205" Apr 24 23:53:54.879478 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.879444 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nqwjk" 
podStartSLOduration=11.924196241 podStartE2EDuration="20.879433538s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.300002688 +0000 UTC m=+3.164494954" lastFinishedPulling="2026-04-24 23:53:46.255239969 +0000 UTC m=+12.119732251" observedRunningTime="2026-04-24 23:53:54.879129052 +0000 UTC m=+20.743621350" watchObservedRunningTime="2026-04-24 23:53:54.879433538 +0000 UTC m=+20.743925828" Apr 24 23:53:54.890724 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.890689 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r4l6n" podStartSLOduration=4.295792174 podStartE2EDuration="20.890679945s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.323129764 +0000 UTC m=+3.187622029" lastFinishedPulling="2026-04-24 23:53:53.918017533 +0000 UTC m=+19.782509800" observedRunningTime="2026-04-24 23:53:54.890225344 +0000 UTC m=+20.754717633" watchObservedRunningTime="2026-04-24 23:53:54.890679945 +0000 UTC m=+20.755172233" Apr 24 23:53:54.903034 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.902991 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-brxjn" podStartSLOduration=4.298074089 podStartE2EDuration="20.902976039s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.323264837 +0000 UTC m=+3.187757109" lastFinishedPulling="2026-04-24 23:53:53.928166775 +0000 UTC m=+19.792659059" observedRunningTime="2026-04-24 23:53:54.902370589 +0000 UTC m=+20.766862876" watchObservedRunningTime="2026-04-24 23:53:54.902976039 +0000 UTC m=+20.767468328" Apr 24 23:53:54.951330 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:54.951278 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fs6wc" podStartSLOduration=4.282228462 podStartE2EDuration="20.95126145s" 
podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.30118949 +0000 UTC m=+3.165681759" lastFinishedPulling="2026-04-24 23:53:53.970222464 +0000 UTC m=+19.834714747" observedRunningTime="2026-04-24 23:53:54.950892881 +0000 UTC m=+20.815385185" watchObservedRunningTime="2026-04-24 23:53:54.95126145 +0000 UTC m=+20.815753738" Apr 24 23:53:55.152170 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.152144 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:55.152712 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.152697 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:55.174241 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.174213 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 23:53:55.660181 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.660022 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:53:55.174236826Z","UUID":"5a63ae7c-f6a5-4757-b3ac-612bc9131c91","Handler":null,"Name":"","Endpoint":""} Apr 24 23:53:55.662129 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.662105 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 23:53:55.662245 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.662137 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 23:53:55.726967 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.726939 2566 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:55.727115 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.726952 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:55.727115 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:55.727078 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:55.727229 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:55.727171 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:55.866280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.866238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" event={"ID":"a7885794-c248-4c2f-8880-b5b7573c5cfc","Type":"ContainerStarted","Data":"3e709074c8caf11c6a2c038c1fe7c540602fc1f0d4e855f627c631df0149cafa"} Apr 24 23:53:55.869425 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.869402 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 24 23:53:55.869871 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.869838 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"affc42a1c49ec931ec97a2fd1a068239d4f840156b488d054b01e57c3a5c9579"} Apr 24 23:53:55.871506 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:55.871484 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cgg9j" event={"ID":"2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0","Type":"ContainerStarted","Data":"1fdb954e8b7d6aadc88c742882e472eeea10fb4ead68239078999310b71b14a4"} Apr 24 23:53:56.726868 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:56.726652 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:56.727094 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:56.726979 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:56.878497 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:56.878449 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" event={"ID":"a7885794-c248-4c2f-8880-b5b7573c5cfc","Type":"ContainerStarted","Data":"4e542661399db6a7227a7af92251b3323b1a75c06dad830593caf152e72c93d2"} Apr 24 23:53:56.878497 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:56.878495 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:53:56.893707 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:56.893645 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cgg9j" podStartSLOduration=6.305435817 podStartE2EDuration="22.893629153s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.329758097 +0000 UTC m=+3.194250363" lastFinishedPulling="2026-04-24 23:53:53.917951418 +0000 UTC m=+19.782443699" observedRunningTime="2026-04-24 23:53:55.906681335 +0000 UTC m=+21.771173624" watchObservedRunningTime="2026-04-24 23:53:56.893629153 +0000 UTC m=+22.758121442" Apr 24 23:53:56.893875 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:56.893812 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6r7j9" podStartSLOduration=3.928112027 podStartE2EDuration="22.893806295s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.330012418 +0000 UTC m=+3.194504685" lastFinishedPulling="2026-04-24 23:53:56.295706678 +0000 UTC m=+22.160198953" observedRunningTime="2026-04-24 23:53:56.89316613 +0000 UTC m=+22.757658419" watchObservedRunningTime="2026-04-24 23:53:56.893806295 +0000 UTC m=+22.758298615" Apr 24 23:53:57.297875 ip-10-0-140-130 kubenswrapper[2566]: I0424 
23:53:57.297839 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:57.298538 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:57.298503 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r4l6n" Apr 24 23:53:57.726283 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:57.726249 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:57.726480 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:57.726249 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:57.726480 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:57.726390 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:57.726480 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:57.726423 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:57.885055 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:57.885033 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 24 23:53:57.885672 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:57.885477 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"13fdff223a966aeb45d93a7f7372fa6445d4de52059ca5105f8ac2eb2c87be78"} Apr 24 23:53:58.726457 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:58.726419 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:53:58.726639 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:58.726557 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:53:59.726562 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.726386 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:53:59.727234 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.726386 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:53:59.727234 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:59.726686 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:53:59.727234 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:53:59.726781 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:53:59.891115 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.891073 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b15e14-7d7c-4b19-bb25-2e48ae26af80" containerID="da423320263063af895e50861220bb73597928522317d5c09ad886f0abc2f953" exitCode=0 Apr 24 23:53:59.891275 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.891161 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerDied","Data":"da423320263063af895e50861220bb73597928522317d5c09ad886f0abc2f953"} Apr 24 23:53:59.894158 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.894144 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 24 23:53:59.894469 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.894447 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"01225cd8e0b32b027c0cf04f765806bd0f4842f442fe09476a2d3726240659ee"} Apr 24 23:53:59.894759 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.894740 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:59.894856 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.894763 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:59.894906 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.894892 2566 scope.go:117] "RemoveContainer" containerID="929b537858f4739584cf05b5b6a3f49cf6b7125fa80f6bdddaf6fa2b6e4dad88" Apr 24 23:53:59.909459 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.909442 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:53:59.910267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:53:59.910204 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:54:00.726543 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.726511 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:00.726858 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:00.726661 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:54:00.899281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.899210 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 24 23:54:00.899617 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.899594 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" event={"ID":"b4228f03-25e8-4a96-b72d-5f9fa76ee207","Type":"ContainerStarted","Data":"06a062f556efab3a6249ee2f5222020971c6e63d115515864f46d3c03b7833e2"} Apr 24 23:54:00.899730 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.899715 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:54:00.901560 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.901536 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b15e14-7d7c-4b19-bb25-2e48ae26af80" containerID="79f5c04c445e56f6a6570773ad7845698a23dd226fa52378c0a0f07082047d6a" exitCode=0 Apr 24 23:54:00.901711 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.901587 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerDied","Data":"79f5c04c445e56f6a6570773ad7845698a23dd226fa52378c0a0f07082047d6a"} Apr 24 23:54:00.934550 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:00.934506 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" podStartSLOduration=10.075129205 podStartE2EDuration="26.934491431s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.326080289 +0000 UTC m=+3.190572560" lastFinishedPulling="2026-04-24 23:53:54.185442515 +0000 UTC m=+20.049934786" observedRunningTime="2026-04-24 
23:54:00.934289513 +0000 UTC m=+26.798781801" watchObservedRunningTime="2026-04-24 23:54:00.934491431 +0000 UTC m=+26.798983750" Apr 24 23:54:01.451953 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.451924 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qr6ml"] Apr 24 23:54:01.452162 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.452055 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:01.452208 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:01.452157 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:54:01.456621 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.456589 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wf82j"] Apr 24 23:54:01.456777 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.456730 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:01.456877 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:01.456854 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:54:01.459332 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.459310 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v8dd2"] Apr 24 23:54:01.459490 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.459382 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:01.459490 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:01.459474 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:54:01.906033 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.905953 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b15e14-7d7c-4b19-bb25-2e48ae26af80" containerID="3347a877e67a4d688fa60134221592b924e1371b0dc4a0f5aac010531bd1581f" exitCode=0 Apr 24 23:54:01.906367 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.906036 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerDied","Data":"3347a877e67a4d688fa60134221592b924e1371b0dc4a0f5aac010531bd1581f"} Apr 24 23:54:01.906367 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:01.906175 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:54:02.332375 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:02.332199 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:54:02.729145 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:02.729116 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:02.729145 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:02.729133 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:02.729353 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:02.729235 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:54:02.729401 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:02.729364 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:54:03.726269 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:03.726196 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:03.726774 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:03.726333 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:54:04.726983 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:04.726943 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:04.727708 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:04.727057 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:54:04.727708 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:04.727115 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:04.727708 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:04.727354 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:54:05.726171 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:05.726131 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:05.726348 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:05.726263 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v8dd2" podUID="58df42ab-cad3-4814-9298-b1098600ccdc" Apr 24 23:54:06.726809 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:06.726747 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:06.727292 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:06.726881 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qr6ml" podUID="c5384178-0a8f-4c23-96ba-bcbe045f676c" Apr 24 23:54:06.727292 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:06.726949 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:06.727292 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:06.727082 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:54:06.987878 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:06.987850 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-130.ec2.internal" event="NodeReady" Apr 24 23:54:06.988063 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:06.987991 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:54:07.021899 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.021864 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-t7llw"] Apr 24 23:54:07.052138 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.052108 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-785f97fdcb-rww6s"] Apr 24 23:54:07.052298 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.052273 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.056005 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.055977 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 23:54:07.056146 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.056029 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 23:54:07.056335 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.056316 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jxmc2\"" Apr 24 23:54:07.067233 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.067205 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-t7llw"] Apr 24 23:54:07.067233 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.067234 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-785f97fdcb-rww6s"] Apr 24 23:54:07.067661 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.067252 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v7wqx"] Apr 24 23:54:07.067661 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.067312 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.069853 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.069797 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:54:07.069965 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.069812 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpz6k\"" Apr 24 23:54:07.070033 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.069971 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:54:07.070033 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.069997 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:54:07.074720 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.074700 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:54:07.085428 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.085404 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v7wqx"] Apr 24 23:54:07.085585 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.085513 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.088149 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.088123 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:54:07.088149 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.088141 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:54:07.088291 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.088179 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gqd2q\"" Apr 24 23:54:07.088446 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.088430 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:54:07.143609 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.143581 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xcntc"] Apr 24 23:54:07.152828 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.152799 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.152964 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.152837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1bb33cce-974b-42c1-aafe-f821da1a3f63-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.152964 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.152941 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e290ed95-0b8a-4c01-aa97-91f4caed9f63-ca-trust-extracted\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.153071 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.152985 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.153071 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.153054 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-trusted-ca\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.153173 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.153083 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-bound-sa-token\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.153227 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.153176 
2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-image-registry-private-configuration\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.153285 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.153242 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-installation-pull-secrets\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.153490 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.153327 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v99tr\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-kube-api-access-v99tr\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.153490 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.153392 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-certificates\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.157186 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.157166 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xcntc"] Apr 24 23:54:07.157304 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.157285 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.159783 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.159621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:07.159783 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.159746 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:07.159908 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.159788 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-drk7t\"" Apr 24 23:54:07.254164 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.254164 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254132 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254177 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-trusted-ca\") pod \"image-registry-785f97fdcb-rww6s\" (UID: 
\"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254198 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-bound-sa-token\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-config-volume\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-image-registry-private-configuration\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254256 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-installation-pull-secrets\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.254257 2566 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.254335 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.754318202 +0000 UTC m=+33.618810476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:07.254423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254399 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v99tr\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-kube-api-access-v99tr\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254450 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-certificates\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254522 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254557 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-tmp-dir\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254634 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1bb33cce-974b-42c1-aafe-f821da1a3f63-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254671 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e290ed95-0b8a-4c01-aa97-91f4caed9f63-ca-trust-extracted\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254703 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blqcm\" (UniqueName: \"kubernetes.io/projected/7fe18020-a109-4021-a4a7-567311f209f4-kube-api-access-blqcm\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.254728 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8sh9\" (UniqueName: \"kubernetes.io/projected/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-kube-api-access-d8sh9\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.254779 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.254795 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:07.254833 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.254842 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.754824346 +0000 UTC m=+33.619316627 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:07.255293 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.255003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-certificates\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.255293 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.255101 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e290ed95-0b8a-4c01-aa97-91f4caed9f63-ca-trust-extracted\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.255293 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.255247 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1bb33cce-974b-42c1-aafe-f821da1a3f63-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.255293 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.255247 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-trusted-ca\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 
23:54:07.270124 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.270101 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-image-registry-private-configuration\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.270124 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.270113 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-installation-pull-secrets\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.271811 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.271791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-bound-sa-token\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.271944 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.271926 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v99tr\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-kube-api-access-v99tr\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.356047 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.356202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356070 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-tmp-dir\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.356202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blqcm\" (UniqueName: \"kubernetes.io/projected/7fe18020-a109-4021-a4a7-567311f209f4-kube-api-access-blqcm\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.356202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356109 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8sh9\" (UniqueName: \"kubernetes.io/projected/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-kube-api-access-d8sh9\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.356202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.356202 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.356175 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 
23:54:07.356202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356196 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-config-volume\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.356453 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.356252 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.856230526 +0000 UTC m=+33.720722792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:07.356453 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.356300 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:07.356453 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.356332 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-tmp-dir\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.356453 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.356371 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.856353393 +0000 UTC m=+33.720845669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:07.359037 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.359020 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-config-volume\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.364610 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.364589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8sh9\" (UniqueName: \"kubernetes.io/projected/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-kube-api-access-d8sh9\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.365085 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.365067 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blqcm\" (UniqueName: \"kubernetes.io/projected/7fe18020-a109-4021-a4a7-567311f209f4-kube-api-access-blqcm\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.726274 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.726093 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:07.728931 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.728900 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:07.729382 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.728964 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:07.729382 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.728972 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjkj6\"" Apr 24 23:54:07.758868 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.758823 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:07.759040 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.758889 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:07.759040 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.758971 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:07.759040 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.758984 2566 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:07.759040 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.759028 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:07.759259 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.759033 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.759019377 +0000 UTC m=+34.623511648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:07.759259 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.759084 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.759069942 +0000 UTC m=+34.623562208 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:07.859665 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.859632 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:07.859799 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.859715 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:07.859799 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.859768 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:07.859893 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.859830 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.859814561 +0000 UTC m=+34.724306832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:07.859893 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.859848 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:07.859973 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:07.859895 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.859880718 +0000 UTC m=+34.724372984 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:07.920125 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.920095 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b15e14-7d7c-4b19-bb25-2e48ae26af80" containerID="207a303cf8301fe300be38ce91c07a0af15d7dc9c58a13de9101350973921526" exitCode=0 Apr 24 23:54:07.920253 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:07.920130 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerDied","Data":"207a303cf8301fe300be38ce91c07a0af15d7dc9c58a13de9101350973921526"} Apr 24 23:54:08.365143 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.365057 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod 
\"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:08.365316 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.365227 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:08.365316 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.365307 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:40.365288454 +0000 UTC m=+66.229780734 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:08.466271 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.466241 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:08.466426 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.466359 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:08.466426 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.466408 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret podName:c5384178-0a8f-4c23-96ba-bcbe045f676c nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:40.466394743 +0000 UTC m=+66.330887008 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret") pod "global-pull-secret-syncer-qr6ml" (UID: "c5384178-0a8f-4c23-96ba-bcbe045f676c") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:08.566964 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.566921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:08.569975 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.569953 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdd9x\" (UniqueName: \"kubernetes.io/projected/58df42ab-cad3-4814-9298-b1098600ccdc-kube-api-access-fdd9x\") pod \"network-check-target-v8dd2\" (UID: \"58df42ab-cad3-4814-9298-b1098600ccdc\") " pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:08.635334 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.635255 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:08.726891 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.726672 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:08.727009 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.726896 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:08.730033 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.730013 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:08.731862 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.731840 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:08.732244 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.732077 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qtxb2\"" Apr 24 23:54:08.768087 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.768048 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:08.768200 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.768165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:08.768300 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.768281 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:08.768300 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.768303 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:08.768445 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.768349 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.768335395 +0000 UTC m=+36.632827661 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:08.768445 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.768279 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:08.768445 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.768423 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.768410566 +0000 UTC m=+36.632902841 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:08.831146 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.831112 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v8dd2"] Apr 24 23:54:08.836833 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:54:08.836809 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58df42ab_cad3_4814_9298_b1098600ccdc.slice/crio-a1793baf3c43e58e629d73c9bc4c4454e065cf457fd9a75e37a0e71a1398cb66 WatchSource:0}: Error finding container a1793baf3c43e58e629d73c9bc4c4454e065cf457fd9a75e37a0e71a1398cb66: Status 404 returned error can't find the container with id a1793baf3c43e58e629d73c9bc4c4454e065cf457fd9a75e37a0e71a1398cb66 Apr 24 23:54:08.868439 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.868421 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:08.868537 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.868491 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:08.868604 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.868554 2566 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:08.868604 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.868589 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:08.868677 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.868626 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.868610095 +0000 UTC m=+36.733102361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:08.868677 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:08.868639 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.868633401 +0000 UTC m=+36.733125667 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:08.924645 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.924585 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b15e14-7d7c-4b19-bb25-2e48ae26af80" containerID="7910683e75ee7c879b4c10fa86d92bf0c4433e040d4f6cd1dc8395cfd86b57f4" exitCode=0 Apr 24 23:54:08.924645 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.924607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerDied","Data":"7910683e75ee7c879b4c10fa86d92bf0c4433e040d4f6cd1dc8395cfd86b57f4"} Apr 24 23:54:08.925692 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:08.925671 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v8dd2" event={"ID":"58df42ab-cad3-4814-9298-b1098600ccdc","Type":"ContainerStarted","Data":"a1793baf3c43e58e629d73c9bc4c4454e065cf457fd9a75e37a0e71a1398cb66"} Apr 24 23:54:09.931410 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:09.931241 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44g68" event={"ID":"93b15e14-7d7c-4b19-bb25-2e48ae26af80","Type":"ContainerStarted","Data":"09e97521273e3a122a2eb081f324a83f94b363e47ab3d9c645e74928c8ca0c2b"} Apr 24 23:54:09.954856 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:09.954771 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-44g68" podStartSLOduration=5.759795296 podStartE2EDuration="35.954752615s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:53:37.329911352 +0000 UTC m=+3.194403618" 
lastFinishedPulling="2026-04-24 23:54:07.524868664 +0000 UTC m=+33.389360937" observedRunningTime="2026-04-24 23:54:09.952718906 +0000 UTC m=+35.817211188" watchObservedRunningTime="2026-04-24 23:54:09.954752615 +0000 UTC m=+35.819244905" Apr 24 23:54:10.783364 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:10.783319 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:10.783590 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:10.783439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:10.783590 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.783476 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:10.783590 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.783499 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:10.783590 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.783556 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:10.783590 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.783589 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.783550897 +0000 UTC m=+40.648043163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:10.783830 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.783628 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.783611239 +0000 UTC m=+40.648103507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:10.884353 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:10.884320 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:10.884519 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:10.884399 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " 
pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:10.884519 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.884476 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:10.884648 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.884544 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.884527543 +0000 UTC m=+40.749019815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:10.884648 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.884580 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:10.884744 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:10.884651 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.884631951 +0000 UTC m=+40.749124223 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:12.938444 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:12.938409 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v8dd2" event={"ID":"58df42ab-cad3-4814-9298-b1098600ccdc","Type":"ContainerStarted","Data":"cda651c9a152227411d4650d0574baa560e7ab729a64c5e9b31c61cbd6091a4f"} Apr 24 23:54:12.939011 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:12.938547 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:12.956609 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:12.956548 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-v8dd2" podStartSLOduration=35.881669158 podStartE2EDuration="38.956533566s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:54:08.838538059 +0000 UTC m=+34.703030325" lastFinishedPulling="2026-04-24 23:54:11.913402467 +0000 UTC m=+37.777894733" observedRunningTime="2026-04-24 23:54:12.955340086 +0000 UTC m=+38.819832373" watchObservedRunningTime="2026-04-24 23:54:12.956533566 +0000 UTC m=+38.821025853" Apr 24 23:54:14.816845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:14.816793 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:14.817311 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:14.816969 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:14.817311 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.816988 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:14.817311 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.817012 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:14.817311 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.817077 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.817056611 +0000 UTC m=+48.681548891 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:14.817311 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.817080 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:14.817311 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.817118 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.817107125 +0000 UTC m=+48.681599394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:14.917900 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:14.917867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:14.918052 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:14.917968 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " 
pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:14.918052 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.918010 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:14.918130 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.918066 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:14.918130 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.918078 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.918057311 +0000 UTC m=+48.782549579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:14.918130 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:14.918103 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.918092828 +0000 UTC m=+48.782585108 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:22.873926 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:22.873884 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:22.874385 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:22.873980 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:22.874385 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.874043 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:22.874385 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.874081 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:22.874385 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.874095 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:22.874385 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.874107 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:38.874090811 +0000 UTC m=+64.738583084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:22.874385 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.874135 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:38.874122817 +0000 UTC m=+64.738615083 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:22.974756 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:22.974725 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:22.974899 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:22.974798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " 
pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:22.974899 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.974863 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:22.974899 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.974891 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:22.974997 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.974915 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:38.974901868 +0000 UTC m=+64.839394138 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:22.974997 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:22.974935 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:38.974924461 +0000 UTC m=+64.839416726 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:32.918474 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:32.918444 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7lm4" Apr 24 23:54:38.893799 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:38.893758 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:54:38.894180 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:38.893818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:54:38.894180 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.893915 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:38.894180 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.893922 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:38.894180 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.893940 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:54:38.894180 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.893979 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:10.893965822 +0000 UTC m=+96.758458088 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:54:38.894180 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.893992 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:10.89398593 +0000 UTC m=+96.758478196 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:54:38.994845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:38.994815 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:54:38.995005 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:38.994864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:54:38.995005 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.994949 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:38.995005 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.994990 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:10.994978523 +0000 UTC m=+96.859470795 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:54:38.995108 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.994950 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:38.995108 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:38.995080 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:55:10.995066156 +0000 UTC m=+96.859558431 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:40.406351 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.406298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:54:40.410007 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.409990 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:40.417149 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:40.417131 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:54:40.417211 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:54:40.417201 
2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:44.417181538 +0000 UTC m=+130.281673817 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : secret "metrics-daemon-secret" not found Apr 24 23:54:40.507664 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.507630 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:40.510155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.510129 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:40.521882 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.521858 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5384178-0a8f-4c23-96ba-bcbe045f676c-original-pull-secret\") pod \"global-pull-secret-syncer-qr6ml\" (UID: \"c5384178-0a8f-4c23-96ba-bcbe045f676c\") " pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:40.539997 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.539975 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qr6ml" Apr 24 23:54:40.653686 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.653654 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qr6ml"] Apr 24 23:54:40.657558 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:54:40.657496 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5384178_0a8f_4c23_96ba_bcbe045f676c.slice/crio-0ac9d120a5db0b6c8bf8b5a1db42a400ff393cb059da27e6e87e2edd47eda919 WatchSource:0}: Error finding container 0ac9d120a5db0b6c8bf8b5a1db42a400ff393cb059da27e6e87e2edd47eda919: Status 404 returned error can't find the container with id 0ac9d120a5db0b6c8bf8b5a1db42a400ff393cb059da27e6e87e2edd47eda919 Apr 24 23:54:40.991347 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:40.991310 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qr6ml" event={"ID":"c5384178-0a8f-4c23-96ba-bcbe045f676c","Type":"ContainerStarted","Data":"0ac9d120a5db0b6c8bf8b5a1db42a400ff393cb059da27e6e87e2edd47eda919"} Apr 24 23:54:43.943004 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:43.942970 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-v8dd2" Apr 24 23:54:45.000604 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:45.000552 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qr6ml" event={"ID":"c5384178-0a8f-4c23-96ba-bcbe045f676c","Type":"ContainerStarted","Data":"e96b9d97edfde50b21c8ef1722d6ad7588bc92e61820e5bd60be1116cb913284"} Apr 24 23:54:45.022261 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:54:45.022214 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qr6ml" podStartSLOduration=65.670917007 podStartE2EDuration="1m9.02220158s" 
podCreationTimestamp="2026-04-24 23:53:36 +0000 UTC" firstStartedPulling="2026-04-24 23:54:40.659109511 +0000 UTC m=+66.523601778" lastFinishedPulling="2026-04-24 23:54:44.010394073 +0000 UTC m=+69.874886351" observedRunningTime="2026-04-24 23:54:45.021731189 +0000 UTC m=+70.886223478" watchObservedRunningTime="2026-04-24 23:54:45.02220158 +0000 UTC m=+70.886693867" Apr 24 23:55:10.940355 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:10.940320 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:55:10.940764 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:10.940387 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:55:10.940764 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:10.940494 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:55:10.940764 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:10.940505 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:55:10.940764 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:10.940513 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:55:10.940764 
ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:10.940605 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:14.940553121 +0000 UTC m=+160.805045399 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found
Apr 24 23:55:10.940764 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:10.940631 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:14.94061855 +0000 UTC m=+160.805110830 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found
Apr 24 23:55:11.041301 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:11.041266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx"
Apr 24 23:55:11.041448 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:11.041351 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc"
Apr 24 23:55:11.041448 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:11.041395 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:55:11.041448 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:11.041441 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:55:11.041541 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:11.041460 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:15.041447153 +0000 UTC m=+160.905939420 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found
Apr 24 23:55:11.041541 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:11.041520 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:15.041509292 +0000 UTC m=+160.906001557 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found
Apr 24 23:55:44.484104 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:44.484062 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j"
Apr 24 23:55:44.484608 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:44.484233 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 23:55:44.484608 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:55:44.484331 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs podName:e101d25b-89b6-4522-8e39-35b94ce4d935 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:46.484314187 +0000 UTC m=+252.348806458 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs") pod "network-metrics-daemon-wf82j" (UID: "e101d25b-89b6-4522-8e39-35b94ce4d935") : secret "metrics-daemon-secret" not found
Apr 24 23:55:59.996498 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:59.996463 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"]
Apr 24 23:55:59.999138 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:59.999118 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"
Apr 24 23:55:59.999491 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:55:59.999469 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5tx5"]
Apr 24 23:56:00.001379 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.001354 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 23:56:00.001494 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.001354 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:56:00.002104 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.002089 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.002280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.002257 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-wrvwh\""
Apr 24 23:56:00.004498 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.004439 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 23:56:00.004600 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.004499 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2n65l\""
Apr 24 23:56:00.004600 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.004515 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 23:56:00.004715 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.004699 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 23:56:00.004769 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.004753 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 23:56:00.007074 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.007052 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"]
Apr 24 23:56:00.011982 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.011960 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5tx5"]
Apr 24 23:56:00.012589 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.012537 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 23:56:00.099924 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.099872 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-c997485db-f7mxn"]
Apr 24 23:56:00.102791 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.102770 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.103057 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103028 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1d003339-504a-4e95-aba4-a47bafe0f0d6-snapshots\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.103170 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103068 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d003339-504a-4e95-aba4-a47bafe0f0d6-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.103170 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103115 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rr8\" (UniqueName: \"kubernetes.io/projected/0953ba64-8811-4f3a-a698-e921f97eab59-kube-api-access-j2rr8\") pod \"volume-data-source-validator-7c6cbb6c87-l6dz5\" (UID: \"0953ba64-8811-4f3a-a698-e921f97eab59\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"
Apr 24 23:56:00.103257 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103194 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d003339-504a-4e95-aba4-a47bafe0f0d6-tmp\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.103257 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103230 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d003339-504a-4e95-aba4-a47bafe0f0d6-serving-cert\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.103257 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103249 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mz95\" (UniqueName: \"kubernetes.io/projected/1d003339-504a-4e95-aba4-a47bafe0f0d6-kube-api-access-4mz95\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.103395 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.103304 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d003339-504a-4e95-aba4-a47bafe0f0d6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.105163 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105143 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 23:56:00.105163 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105159 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 23:56:00.105401 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105373 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 23:56:00.105449 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105429 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sw76l\""
Apr 24 23:56:00.105694 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105678 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 23:56:00.105786 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105721 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 23:56:00.105786 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.105724 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 23:56:00.111921 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.111902 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-c997485db-f7mxn"]
Apr 24 23:56:00.204497 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mz95\" (UniqueName: \"kubernetes.io/projected/1d003339-504a-4e95-aba4-a47bafe0f0d6-kube-api-access-4mz95\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.204716 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204517 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.204716 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d003339-504a-4e95-aba4-a47bafe0f0d6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.204716 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204614 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-stats-auth\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.204716 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d003339-504a-4e95-aba4-a47bafe0f0d6-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.204716 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204711 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csx55\" (UniqueName: \"kubernetes.io/projected/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-kube-api-access-csx55\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.204988 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204784 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1d003339-504a-4e95-aba4-a47bafe0f0d6-snapshots\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.204988 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rr8\" (UniqueName: \"kubernetes.io/projected/0953ba64-8811-4f3a-a698-e921f97eab59-kube-api-access-j2rr8\") pod \"volume-data-source-validator-7c6cbb6c87-l6dz5\" (UID: \"0953ba64-8811-4f3a-a698-e921f97eab59\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"
Apr 24 23:56:00.204988 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204852 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-default-certificate\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.204988 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204877 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.204988 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204910 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d003339-504a-4e95-aba4-a47bafe0f0d6-tmp\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.204988 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.204960 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d003339-504a-4e95-aba4-a47bafe0f0d6-serving-cert\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.205306 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.205277 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d003339-504a-4e95-aba4-a47bafe0f0d6-tmp\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.205413 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.205313 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1d003339-504a-4e95-aba4-a47bafe0f0d6-snapshots\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.205654 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.205634 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d003339-504a-4e95-aba4-a47bafe0f0d6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.205776 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.205759 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d003339-504a-4e95-aba4-a47bafe0f0d6-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.207168 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.207151 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d003339-504a-4e95-aba4-a47bafe0f0d6-serving-cert\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.212883 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.212864 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mz95\" (UniqueName: \"kubernetes.io/projected/1d003339-504a-4e95-aba4-a47bafe0f0d6-kube-api-access-4mz95\") pod \"insights-operator-585dfdc468-s5tx5\" (UID: \"1d003339-504a-4e95-aba4-a47bafe0f0d6\") " pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.213031 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.213013 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rr8\" (UniqueName: \"kubernetes.io/projected/0953ba64-8811-4f3a-a698-e921f97eab59-kube-api-access-j2rr8\") pod \"volume-data-source-validator-7c6cbb6c87-l6dz5\" (UID: \"0953ba64-8811-4f3a-a698-e921f97eab59\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"
Apr 24 23:56:00.305634 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.305538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.305634 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.305597 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-stats-auth\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.305799 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:00.305688 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:56:00.305799 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:00.305761 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:00.805743174 +0000 UTC m=+146.670235453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : secret "router-metrics-certs-default" not found
Apr 24 23:56:00.305925 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.305901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csx55\" (UniqueName: \"kubernetes.io/projected/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-kube-api-access-csx55\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.306051 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.305991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-default-certificate\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.306051 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.306022 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.306158 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:00.306140 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:00.806128901 +0000 UTC m=+146.670621172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : configmap references non-existent config key: service-ca.crt
Apr 24 23:56:00.307950 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.307930 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-stats-auth\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.308163 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.308144 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-default-certificate\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.311012 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.310995 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"
Apr 24 23:56:00.316471 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.316452 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csx55\" (UniqueName: \"kubernetes.io/projected/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-kube-api-access-csx55\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.318181 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.318165 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-s5tx5"
Apr 24 23:56:00.431130 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.431101 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5"]
Apr 24 23:56:00.434468 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:00.434437 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0953ba64_8811_4f3a_a698_e921f97eab59.slice/crio-02ed1f861d1f9382f39105bff459d9092b88c9284e5553df3c5133c885c1eaf9 WatchSource:0}: Error finding container 02ed1f861d1f9382f39105bff459d9092b88c9284e5553df3c5133c885c1eaf9: Status 404 returned error can't find the container with id 02ed1f861d1f9382f39105bff459d9092b88c9284e5553df3c5133c885c1eaf9
Apr 24 23:56:00.446783 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.446761 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5tx5"]
Apr 24 23:56:00.449971 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:00.449943 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d003339_504a_4e95_aba4_a47bafe0f0d6.slice/crio-213cb88a6cf0043b56271ace0d6995f5ac7e67f74b60d725da6052240e550174 WatchSource:0}: Error finding container 213cb88a6cf0043b56271ace0d6995f5ac7e67f74b60d725da6052240e550174: Status 404 returned error can't find the container with id 213cb88a6cf0043b56271ace0d6995f5ac7e67f74b60d725da6052240e550174
Apr 24 23:56:00.809823 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.809781 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.809823 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:00.809828 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:00.810017 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:00.809937 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:56:00.810017 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:00.809962 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.809940049 +0000 UTC m=+147.674432326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : configmap references non-existent config key: service-ca.crt
Apr 24 23:56:00.810017 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:00.809988 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.809975841 +0000 UTC m=+147.674468107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : secret "router-metrics-certs-default" not found
Apr 24 23:56:01.145964 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:01.145849 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5" event={"ID":"0953ba64-8811-4f3a-a698-e921f97eab59","Type":"ContainerStarted","Data":"02ed1f861d1f9382f39105bff459d9092b88c9284e5553df3c5133c885c1eaf9"}
Apr 24 23:56:01.147137 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:01.147104 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5tx5" event={"ID":"1d003339-504a-4e95-aba4-a47bafe0f0d6","Type":"ContainerStarted","Data":"213cb88a6cf0043b56271ace0d6995f5ac7e67f74b60d725da6052240e550174"}
Apr 24 23:56:01.819481 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:01.819443 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:01.819666 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:01.819622 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:56:01.819752 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:01.819679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:01.819752 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:01.819700 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:03.819676038 +0000 UTC m=+149.684168320 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : secret "router-metrics-certs-default" not found
Apr 24 23:56:01.819870 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:01.819783 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:03.819766747 +0000 UTC m=+149.684259045 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : configmap references non-existent config key: service-ca.crt
Apr 24 23:56:02.150279 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:02.150247 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5" event={"ID":"0953ba64-8811-4f3a-a698-e921f97eab59","Type":"ContainerStarted","Data":"da10fe77d32a9b9bb55a36cdfdd5e231b0c114f46d61d1c902f0b8259d84a2f4"}
Apr 24 23:56:02.167842 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:02.167792 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l6dz5" podStartSLOduration=2.015215039 podStartE2EDuration="3.167772198s" podCreationTimestamp="2026-04-24 23:55:59 +0000 UTC" firstStartedPulling="2026-04-24 23:56:00.436289046 +0000 UTC m=+146.300781313" lastFinishedPulling="2026-04-24 23:56:01.588846202 +0000 UTC m=+147.453338472" observedRunningTime="2026-04-24 23:56:02.16770561 +0000 UTC m=+148.032197896" watchObservedRunningTime="2026-04-24 23:56:02.167772198 +0000 UTC m=+148.032264485"
Apr 24 23:56:03.153985 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:03.153946 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5tx5" event={"ID":"1d003339-504a-4e95-aba4-a47bafe0f0d6","Type":"ContainerStarted","Data":"76fc40c60e0fd5114b8c63a9caae37b3a6bfd9a879da9ef109f850742d23b97d"}
Apr 24 23:56:03.170282 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:03.170234 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-s5tx5" podStartSLOduration=2.4981368 podStartE2EDuration="4.170222746s" podCreationTimestamp="2026-04-24 23:55:59 +0000 UTC" firstStartedPulling="2026-04-24 23:56:00.451286429 +0000 UTC m=+146.315778698" lastFinishedPulling="2026-04-24 23:56:02.123372367 +0000 UTC m=+147.987864644" observedRunningTime="2026-04-24 23:56:03.168961665 +0000 UTC m=+149.033453954" watchObservedRunningTime="2026-04-24 23:56:03.170222746 +0000 UTC m=+149.034715034"
Apr 24 23:56:03.835943 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:03.835897 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:03.835943 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:03.835950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn"
Apr 24 23:56:03.836163 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:03.836082 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:07.83606216 +0000 UTC m=+153.700554428 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : configmap references non-existent config key: service-ca.crt
Apr 24 23:56:03.836163 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:03.836094 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:56:03.836163 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:03.836129 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:07.83611898 +0000 UTC m=+153.700611251 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : secret "router-metrics-certs-default" not found
Apr 24 23:56:05.491090 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:05.491059 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tdkpx_07e0eebb-8365-490f-b2b2-16f26075fac7/dns-node-resolver/0.log"
Apr 24 23:56:06.290506 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:06.290479 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nqwjk_cdb63bec-b61b-4953-b2fc-7f06ee7063ac/node-ca/0.log"
Apr 24 23:56:07.869627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:07.869555 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID:
\"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:07.869627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:07.869637 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:07.870106 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:07.869751 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 23:56:07.870106 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:07.869777 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:15.869753033 +0000 UTC m=+161.734245315 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : configmap references non-existent config key: service-ca.crt Apr 24 23:56:07.870106 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:07.869849 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:15.869836928 +0000 UTC m=+161.734329195 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : secret "router-metrics-certs-default" not found Apr 24 23:56:09.961012 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.960981 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg"] Apr 24 23:56:09.963840 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.963824 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:09.966481 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.966455 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tzp2p\"" Apr 24 23:56:09.967411 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.967390 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 23:56:09.967517 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.967413 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:56:09.967517 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.967390 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 23:56:09.976267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:09.976244 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg"] Apr 24 23:56:10.065842 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.065796 2566 pod_workers.go:1301] 
"Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" podUID="1bb33cce-974b-42c1-aafe-f821da1a3f63" Apr 24 23:56:10.078106 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.078076 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" podUID="e290ed95-0b8a-4c01-aa97-91f4caed9f63" Apr 24 23:56:10.085376 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.085347 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prkn2\" (UniqueName: \"kubernetes.io/projected/44c53046-7de7-4ebe-b217-c83e9db484c4-kube-api-access-prkn2\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:10.085501 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.085394 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:10.095389 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.095362 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-v7wqx" podUID="7fe18020-a109-4021-a4a7-567311f209f4" Apr 24 
23:56:10.165655 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.165629 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:56:10.165784 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.165629 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:56:10.168146 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.168125 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xcntc" podUID="6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f" Apr 24 23:56:10.186351 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.186329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:10.186441 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.186431 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkn2\" (UniqueName: \"kubernetes.io/projected/44c53046-7de7-4ebe-b217-c83e9db484c4-kube-api-access-prkn2\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:10.186503 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.186489 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:56:10.186561 ip-10-0-140-130 
kubenswrapper[2566]: E0424 23:56:10.186551 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls podName:44c53046-7de7-4ebe-b217-c83e9db484c4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:10.68653752 +0000 UTC m=+156.551029786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wcrlg" (UID: "44c53046-7de7-4ebe-b217-c83e9db484c4") : secret "samples-operator-tls" not found Apr 24 23:56:10.195115 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.195095 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prkn2\" (UniqueName: \"kubernetes.io/projected/44c53046-7de7-4ebe-b217-c83e9db484c4-kube-api-access-prkn2\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:10.691170 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:10.691132 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:10.691344 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.691259 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:56:10.691344 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:10.691311 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls podName:44c53046-7de7-4ebe-b217-c83e9db484c4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:11.691298191 +0000 UTC m=+157.555790457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wcrlg" (UID: "44c53046-7de7-4ebe-b217-c83e9db484c4") : secret "samples-operator-tls" not found Apr 24 23:56:11.699026 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:11.698972 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:11.699419 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:11.699135 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:56:11.699419 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:11.699204 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls podName:44c53046-7de7-4ebe-b217-c83e9db484c4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:13.699188038 +0000 UTC m=+159.563680308 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wcrlg" (UID: "44c53046-7de7-4ebe-b217-c83e9db484c4") : secret "samples-operator-tls" not found Apr 24 23:56:11.745008 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:11.744966 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wf82j" podUID="e101d25b-89b6-4522-8e39-35b94ce4d935" Apr 24 23:56:13.714328 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:13.714293 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:13.714711 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:13.714437 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:56:13.714711 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:13.714503 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls podName:44c53046-7de7-4ebe-b217-c83e9db484c4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.714488013 +0000 UTC m=+163.578980280 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wcrlg" (UID: "44c53046-7de7-4ebe-b217-c83e9db484c4") : secret "samples-operator-tls" not found Apr 24 23:56:15.025683 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:15.025638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") pod \"image-registry-785f97fdcb-rww6s\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:56:15.026177 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:15.025698 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:56:15.026177 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.025772 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:56:15.026177 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.025793 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-785f97fdcb-rww6s: secret "image-registry-tls" not found Apr 24 23:56:15.026177 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.025836 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:56:15.026177 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.025853 2566 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls podName:e290ed95-0b8a-4c01-aa97-91f4caed9f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:58:17.025838636 +0000 UTC m=+282.890330925 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls") pod "image-registry-785f97fdcb-rww6s" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63") : secret "image-registry-tls" not found Apr 24 23:56:15.026177 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.025876 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert podName:1bb33cce-974b-42c1-aafe-f821da1a3f63 nodeName:}" failed. No retries permitted until 2026-04-24 23:58:17.025865166 +0000 UTC m=+282.890357435 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t7llw" (UID: "1bb33cce-974b-42c1-aafe-f821da1a3f63") : secret "networking-console-plugin-cert" not found Apr 24 23:56:15.126998 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:15.126949 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc" Apr 24 23:56:15.127167 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.127096 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:56:15.127167 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:15.127116 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:56:15.127167 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.127159 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls podName:6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f nodeName:}" failed. No retries permitted until 2026-04-24 23:58:17.12714482 +0000 UTC m=+282.991637090 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls") pod "dns-default-xcntc" (UID: "6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f") : secret "dns-default-metrics-tls" not found Apr 24 23:56:15.127271 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.127203 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:56:15.127271 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.127242 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert podName:7fe18020-a109-4021-a4a7-567311f209f4 nodeName:}" failed. No retries permitted until 2026-04-24 23:58:17.127231236 +0000 UTC m=+282.991723502 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert") pod "ingress-canary-v7wqx" (UID: "7fe18020-a109-4021-a4a7-567311f209f4") : secret "canary-serving-cert" not found Apr 24 23:56:15.932704 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:15.932663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:15.932961 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:15.932714 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:15.932961 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.932800 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 23:56:15.932961 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.932820 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:31.932803473 +0000 UTC m=+177.797295739 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : configmap references non-existent config key: service-ca.crt Apr 24 23:56:15.932961 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:15.932943 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs podName:71a738cc-01a9-4ee7-ba02-0e53fcb15a2f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:31.932934925 +0000 UTC m=+177.797427190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs") pod "router-default-c997485db-f7mxn" (UID: "71a738cc-01a9-4ee7-ba02-0e53fcb15a2f") : secret "router-metrics-certs-default" not found Apr 24 23:56:16.733904 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.733873 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lwhhh"] Apr 24 23:56:16.739220 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.739202 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.745281 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.745254 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-l8vsl\"" Apr 24 23:56:16.749202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.749170 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lwhhh"] Apr 24 23:56:16.759644 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.759626 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:56:16.774999 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.774977 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:56:16.841258 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.841221 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/851c8e1e-3697-4187-858a-e65677890b54-data-volume\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.841258 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.841263 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.841482 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.841281 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d54tf\" (UniqueName: \"kubernetes.io/projected/851c8e1e-3697-4187-858a-e65677890b54-kube-api-access-d54tf\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.841482 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.841341 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/851c8e1e-3697-4187-858a-e65677890b54-crio-socket\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.841482 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.841384 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/851c8e1e-3697-4187-858a-e65677890b54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942465 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/851c8e1e-3697-4187-858a-e65677890b54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942582 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942550 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/851c8e1e-3697-4187-858a-e65677890b54-data-volume\") pod \"insights-runtime-extractor-lwhhh\" (UID: 
\"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942634 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942594 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942634 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d54tf\" (UniqueName: \"kubernetes.io/projected/851c8e1e-3697-4187-858a-e65677890b54-kube-api-access-d54tf\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942737 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/851c8e1e-3697-4187-858a-e65677890b54-crio-socket\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942787 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:16.942733 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:16.942787 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/851c8e1e-3697-4187-858a-e65677890b54-crio-socket\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " 
pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.942886 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:16.942807 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls podName:851c8e1e-3697-4187-858a-e65677890b54 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.442788307 +0000 UTC m=+163.307280588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-lwhhh" (UID: "851c8e1e-3697-4187-858a-e65677890b54") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:16.942970 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.942936 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/851c8e1e-3697-4187-858a-e65677890b54-data-volume\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.943021 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.943005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/851c8e1e-3697-4187-858a-e65677890b54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:16.955986 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:16.955966 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d54tf\" (UniqueName: \"kubernetes.io/projected/851c8e1e-3697-4187-858a-e65677890b54-kube-api-access-d54tf\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " 
pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:17.011059 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.010973 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-d4lkw"] Apr 24 23:56:17.015107 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.015086 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.017868 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.017834 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 23:56:17.017996 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.017865 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 23:56:17.018067 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.018013 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 23:56:17.018067 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.018044 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 23:56:17.018164 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.018140 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-9zfrw\"" Apr 24 23:56:17.025796 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.025772 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-d4lkw"] Apr 24 23:56:17.144358 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.144330 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b10c8d90-0867-47ac-8c11-877059a278c2-signing-key\") 
pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.144531 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.144380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqzf\" (UniqueName: \"kubernetes.io/projected/b10c8d90-0867-47ac-8c11-877059a278c2-kube-api-access-vcqzf\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.144531 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.144474 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10c8d90-0867-47ac-8c11-877059a278c2-signing-cabundle\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.245674 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.245639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b10c8d90-0867-47ac-8c11-877059a278c2-signing-key\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.245882 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.245700 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqzf\" (UniqueName: \"kubernetes.io/projected/b10c8d90-0867-47ac-8c11-877059a278c2-kube-api-access-vcqzf\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.245882 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.245755 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10c8d90-0867-47ac-8c11-877059a278c2-signing-cabundle\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.246393 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.246367 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10c8d90-0867-47ac-8c11-877059a278c2-signing-cabundle\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.248009 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.247987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b10c8d90-0867-47ac-8c11-877059a278c2-signing-key\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.254608 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.254588 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqzf\" (UniqueName: \"kubernetes.io/projected/b10c8d90-0867-47ac-8c11-877059a278c2-kube-api-access-vcqzf\") pod \"service-ca-865cb79987-d4lkw\" (UID: \"b10c8d90-0867-47ac-8c11-877059a278c2\") " pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.324320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.324240 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-d4lkw" Apr 24 23:56:17.435557 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.435524 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-d4lkw"] Apr 24 23:56:17.438781 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:17.438758 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb10c8d90_0867_47ac_8c11_877059a278c2.slice/crio-dc8e76ce2962618bb1e0d4cab5eb3b6b4c05c87fb42824430932e6dc98348277 WatchSource:0}: Error finding container dc8e76ce2962618bb1e0d4cab5eb3b6b4c05c87fb42824430932e6dc98348277: Status 404 returned error can't find the container with id dc8e76ce2962618bb1e0d4cab5eb3b6b4c05c87fb42824430932e6dc98348277 Apr 24 23:56:17.447863 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.447840 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:17.448010 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:17.447993 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:17.448061 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:17.448052 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls podName:851c8e1e-3697-4187-858a-e65677890b54 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:18.448037663 +0000 UTC m=+164.312529929 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-lwhhh" (UID: "851c8e1e-3697-4187-858a-e65677890b54") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:17.750373 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:17.750341 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:17.750775 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:17.750455 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:56:17.750775 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:17.750514 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls podName:44c53046-7de7-4ebe-b217-c83e9db484c4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:25.750499927 +0000 UTC m=+171.614992192 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wcrlg" (UID: "44c53046-7de7-4ebe-b217-c83e9db484c4") : secret "samples-operator-tls" not found Apr 24 23:56:18.182368 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:18.182325 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-d4lkw" event={"ID":"b10c8d90-0867-47ac-8c11-877059a278c2","Type":"ContainerStarted","Data":"dc8e76ce2962618bb1e0d4cab5eb3b6b4c05c87fb42824430932e6dc98348277"} Apr 24 23:56:18.459218 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:18.459136 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:18.459360 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:18.459275 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:18.459405 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:18.459358 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls podName:851c8e1e-3697-4187-858a-e65677890b54 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:20.45933735 +0000 UTC m=+166.323829620 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-lwhhh" (UID: "851c8e1e-3697-4187-858a-e65677890b54") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:19.186273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:19.186240 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-d4lkw" event={"ID":"b10c8d90-0867-47ac-8c11-877059a278c2","Type":"ContainerStarted","Data":"793a4576123b3dc0d664ba0b836ba2223c1a36d1ec75be38378aad3147cc49b1"} Apr 24 23:56:19.203508 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:19.203464 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-d4lkw" podStartSLOduration=1.676027605 podStartE2EDuration="3.203451878s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:17.440625767 +0000 UTC m=+163.305118033" lastFinishedPulling="2026-04-24 23:56:18.968050035 +0000 UTC m=+164.832542306" observedRunningTime="2026-04-24 23:56:19.20162771 +0000 UTC m=+165.066119997" watchObservedRunningTime="2026-04-24 23:56:19.203451878 +0000 UTC m=+165.067944250" Apr 24 23:56:20.477743 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:20.477707 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:20.478292 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:20.478275 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls podName:851c8e1e-3697-4187-858a-e65677890b54 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:24.478250119 +0000 UTC m=+170.342742396 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-lwhhh" (UID: "851c8e1e-3697-4187-858a-e65677890b54") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:20.478822 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:20.478796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:20.726583 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:20.726529 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v7wqx" Apr 24 23:56:24.511913 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:24.511826 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:24.512269 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:24.511994 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:24.512269 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:24.512071 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls podName:851c8e1e-3697-4187-858a-e65677890b54 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:56:32.512051844 +0000 UTC m=+178.376544127 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls") pod "insights-runtime-extractor-lwhhh" (UID: "851c8e1e-3697-4187-858a-e65677890b54") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:24.727522 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:24.727481 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcntc" Apr 24 23:56:24.727721 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:24.727671 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:56:25.824589 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:25.824533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:25.826883 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:25.826853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44c53046-7de7-4ebe-b217-c83e9db484c4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wcrlg\" (UID: \"44c53046-7de7-4ebe-b217-c83e9db484c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:25.872514 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:25.872487 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" Apr 24 23:56:25.993680 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:25.993654 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg"] Apr 24 23:56:26.205705 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:26.205671 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" event={"ID":"44c53046-7de7-4ebe-b217-c83e9db484c4","Type":"ContainerStarted","Data":"5843b32f12203cb03c461b3828471a9831a93eba7cca0418977764a1803c845e"} Apr 24 23:56:28.212747 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:28.212698 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" event={"ID":"44c53046-7de7-4ebe-b217-c83e9db484c4","Type":"ContainerStarted","Data":"64a07e6daf43359ccce48bb9d7d710a6401d4e5e5767259ae1905c2e1daf9d73"} Apr 24 23:56:28.212747 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:28.212747 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" event={"ID":"44c53046-7de7-4ebe-b217-c83e9db484c4","Type":"ContainerStarted","Data":"dbe303e09fa4911598d1cd1e4b7df73d01d7c740ddaf15350281cd612d4a9f10"} Apr 24 23:56:28.228932 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:28.228886 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wcrlg" podStartSLOduration=17.786075239 podStartE2EDuration="19.228871114s" podCreationTimestamp="2026-04-24 23:56:09 +0000 UTC" firstStartedPulling="2026-04-24 23:56:26.029554029 +0000 UTC m=+171.894046295" lastFinishedPulling="2026-04-24 23:56:27.472349898 +0000 UTC m=+173.336842170" observedRunningTime="2026-04-24 
23:56:28.228275772 +0000 UTC m=+174.092768075" watchObservedRunningTime="2026-04-24 23:56:28.228871114 +0000 UTC m=+174.093363407" Apr 24 23:56:31.977362 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:31.977314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:31.977362 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:31.977372 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:31.977949 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:31.977927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-service-ca-bundle\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:31.979680 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:31.979649 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71a738cc-01a9-4ee7-ba02-0e53fcb15a2f-metrics-certs\") pod \"router-default-c997485db-f7mxn\" (UID: \"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f\") " pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:32.211945 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:32.211903 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:32.327018 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:32.326995 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-c997485db-f7mxn"] Apr 24 23:56:32.329218 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:32.329191 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a738cc_01a9_4ee7_ba02_0e53fcb15a2f.slice/crio-68b3744a51e5d28b6c719c4df6d92513a679d0d674267704b86d3e27dbf5f4db WatchSource:0}: Error finding container 68b3744a51e5d28b6c719c4df6d92513a679d0d674267704b86d3e27dbf5f4db: Status 404 returned error can't find the container with id 68b3744a51e5d28b6c719c4df6d92513a679d0d674267704b86d3e27dbf5f4db Apr 24 23:56:32.583470 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:32.583390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:32.585536 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:32.585517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/851c8e1e-3697-4187-858a-e65677890b54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lwhhh\" (UID: \"851c8e1e-3697-4187-858a-e65677890b54\") " pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:32.648728 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:32.648697 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lwhhh" Apr 24 23:56:32.773973 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:32.773935 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lwhhh"] Apr 24 23:56:32.777319 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:32.777284 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851c8e1e_3697_4187_858a_e65677890b54.slice/crio-b14154e201b6b73f55bedf9a09fb57da604e11b8a40fdddc4604f7f3ebc7b6a1 WatchSource:0}: Error finding container b14154e201b6b73f55bedf9a09fb57da604e11b8a40fdddc4604f7f3ebc7b6a1: Status 404 returned error can't find the container with id b14154e201b6b73f55bedf9a09fb57da604e11b8a40fdddc4604f7f3ebc7b6a1 Apr 24 23:56:33.228512 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:33.228473 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-c997485db-f7mxn" event={"ID":"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f","Type":"ContainerStarted","Data":"e772167134e343e702f0779c5d6b0f3e6348f125ab509b7f9028c636098a889b"} Apr 24 23:56:33.228512 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:33.228512 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-c997485db-f7mxn" event={"ID":"71a738cc-01a9-4ee7-ba02-0e53fcb15a2f","Type":"ContainerStarted","Data":"68b3744a51e5d28b6c719c4df6d92513a679d0d674267704b86d3e27dbf5f4db"} Apr 24 23:56:33.229852 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:33.229827 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lwhhh" event={"ID":"851c8e1e-3697-4187-858a-e65677890b54","Type":"ContainerStarted","Data":"0a0b3b31e0ce656a0026ff5d38261c76000032e1fa6faba5507277d0328f6831"} Apr 24 23:56:33.229852 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:33.229856 2566 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-lwhhh" event={"ID":"851c8e1e-3697-4187-858a-e65677890b54","Type":"ContainerStarted","Data":"b14154e201b6b73f55bedf9a09fb57da604e11b8a40fdddc4604f7f3ebc7b6a1"} Apr 24 23:56:33.247439 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:33.247395 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-c997485db-f7mxn" podStartSLOduration=33.247381733 podStartE2EDuration="33.247381733s" podCreationTimestamp="2026-04-24 23:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:33.245782323 +0000 UTC m=+179.110274652" watchObservedRunningTime="2026-04-24 23:56:33.247381733 +0000 UTC m=+179.111874020" Apr 24 23:56:34.212214 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:34.212182 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:34.214930 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:34.214908 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:34.235787 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:34.235611 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lwhhh" event={"ID":"851c8e1e-3697-4187-858a-e65677890b54","Type":"ContainerStarted","Data":"0cb2a700dea6531716274674a60bdee3a523d65055b0225c0a6e226ac621a4a4"} Apr 24 23:56:34.235787 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:34.235653 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:34.237637 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:34.237614 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-c997485db-f7mxn" Apr 24 23:56:35.239095 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:35.239059 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lwhhh" event={"ID":"851c8e1e-3697-4187-858a-e65677890b54","Type":"ContainerStarted","Data":"a709cdd891a832feb2958d2dcf385c30eb855a5348e417b2aa0049c3adf72792"} Apr 24 23:56:35.257325 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:35.257273 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lwhhh" podStartSLOduration=17.39673515 podStartE2EDuration="19.257259079s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:32.831592133 +0000 UTC m=+178.696084404" lastFinishedPulling="2026-04-24 23:56:34.692116055 +0000 UTC m=+180.556608333" observedRunningTime="2026-04-24 23:56:35.256388755 +0000 UTC m=+181.120881044" watchObservedRunningTime="2026-04-24 23:56:35.257259079 +0000 UTC m=+181.121751422" Apr 24 23:56:40.197887 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.197851 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rnv26"] Apr 24 23:56:40.202009 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.201983 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc"] Apr 24 23:56:40.202158 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.202127 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:40.204561 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.204541 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 23:56:40.204691 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.204605 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 23:56:40.204691 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.204637 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sqfwc\"" Apr 24 23:56:40.205207 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.205189 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:40.207726 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.207707 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 23:56:40.208094 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.208073 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-wfn2h\"" Apr 24 23:56:40.209155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.209133 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc"] Apr 24 23:56:40.212326 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.212288 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rnv26"] Apr 24 23:56:40.240474 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.240444 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/03e56db5-a036-4978-be51-99e3748fbdc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gc5fc\" (UID: \"03e56db5-a036-4978-be51-99e3748fbdc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:40.240634 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.240593 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7lkn\" (UniqueName: \"kubernetes.io/projected/b365c4f6-54a2-4538-bc8c-68262709ee19-kube-api-access-b7lkn\") pod \"downloads-6bcc868b7-rnv26\" (UID: \"b365c4f6-54a2-4538-bc8c-68262709ee19\") " pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:40.341946 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.341915 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/03e56db5-a036-4978-be51-99e3748fbdc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gc5fc\" (UID: \"03e56db5-a036-4978-be51-99e3748fbdc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:40.342103 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.342008 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7lkn\" (UniqueName: \"kubernetes.io/projected/b365c4f6-54a2-4538-bc8c-68262709ee19-kube-api-access-b7lkn\") pod \"downloads-6bcc868b7-rnv26\" (UID: \"b365c4f6-54a2-4538-bc8c-68262709ee19\") " pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:40.344232 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.344212 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/03e56db5-a036-4978-be51-99e3748fbdc4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gc5fc\" 
(UID: \"03e56db5-a036-4978-be51-99e3748fbdc4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:40.350604 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.350551 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7lkn\" (UniqueName: \"kubernetes.io/projected/b365c4f6-54a2-4538-bc8c-68262709ee19-kube-api-access-b7lkn\") pod \"downloads-6bcc868b7-rnv26\" (UID: \"b365c4f6-54a2-4538-bc8c-68262709ee19\") " pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:40.514069 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.513976 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:40.520154 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.520132 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:40.639908 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.639880 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rnv26"] Apr 24 23:56:40.642765 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:40.642735 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb365c4f6_54a2_4538_bc8c_68262709ee19.slice/crio-bc2d32e300c8890e463c4e061e49285ba5fb15b0023dae8f7b5d2655e56229cf WatchSource:0}: Error finding container bc2d32e300c8890e463c4e061e49285ba5fb15b0023dae8f7b5d2655e56229cf: Status 404 returned error can't find the container with id bc2d32e300c8890e463c4e061e49285ba5fb15b0023dae8f7b5d2655e56229cf Apr 24 23:56:40.655352 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:40.655329 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc"] Apr 24 23:56:40.657635 
ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:40.657608 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e56db5_a036_4978_be51_99e3748fbdc4.slice/crio-0b2becee59d237a9d580702cc52c4a7d347b3c7cbb8da846c2d7e533afc1bc72 WatchSource:0}: Error finding container 0b2becee59d237a9d580702cc52c4a7d347b3c7cbb8da846c2d7e533afc1bc72: Status 404 returned error can't find the container with id 0b2becee59d237a9d580702cc52c4a7d347b3c7cbb8da846c2d7e533afc1bc72 Apr 24 23:56:41.257515 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:41.257440 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" event={"ID":"03e56db5-a036-4978-be51-99e3748fbdc4","Type":"ContainerStarted","Data":"0b2becee59d237a9d580702cc52c4a7d347b3c7cbb8da846c2d7e533afc1bc72"} Apr 24 23:56:41.258507 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:41.258475 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rnv26" event={"ID":"b365c4f6-54a2-4538-bc8c-68262709ee19","Type":"ContainerStarted","Data":"bc2d32e300c8890e463c4e061e49285ba5fb15b0023dae8f7b5d2655e56229cf"} Apr 24 23:56:42.262980 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.262936 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" event={"ID":"03e56db5-a036-4978-be51-99e3748fbdc4","Type":"ContainerStarted","Data":"b0ed78e11723cf82db02f31451d81f8f07cf2a25953466d96cd168dc9c05093e"} Apr 24 23:56:42.263407 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.263155 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:42.268783 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.268757 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" Apr 24 23:56:42.278617 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.278560 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gc5fc" podStartSLOduration=1.248315125 podStartE2EDuration="2.278546588s" podCreationTimestamp="2026-04-24 23:56:40 +0000 UTC" firstStartedPulling="2026-04-24 23:56:40.659388527 +0000 UTC m=+186.523880795" lastFinishedPulling="2026-04-24 23:56:41.689619992 +0000 UTC m=+187.554112258" observedRunningTime="2026-04-24 23:56:42.277248602 +0000 UTC m=+188.141740891" watchObservedRunningTime="2026-04-24 23:56:42.278546588 +0000 UTC m=+188.143038875" Apr 24 23:56:42.360469 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.360056 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f79d67b9c-9vgk6"] Apr 24 23:56:42.363749 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.363727 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.366788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.366328 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 23:56:42.366788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.366345 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 23:56:42.366788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.366420 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 23:56:42.366788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.366333 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 23:56:42.366788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.366608 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nptwj\"" Apr 24 23:56:42.366788 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.366674 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 23:56:42.373482 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.373458 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f79d67b9c-9vgk6"] Apr 24 23:56:42.424532 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.424496 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-x6n29"] Apr 24 23:56:42.428149 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.428127 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.431160 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.430986 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 23:56:42.431160 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.430986 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 23:56:42.431160 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.431086 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-x6tlm\"" Apr 24 23:56:42.431383 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.431199 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 23:56:42.431383 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.431275 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 23:56:42.431383 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.431304 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 23:56:42.435499 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.435478 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-x6n29"] Apr 24 23:56:42.461235 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-config\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " 
pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.461385 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461253 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.461467 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461377 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.461519 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461446 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.461597 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461518 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-oauth-config\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.461597 ip-10-0-140-130 kubenswrapper[2566]: I0424 
23:56:42.461551 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-service-ca\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.461682 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461618 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-oauth-serving-cert\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.461682 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461658 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-serving-cert\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.461786 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461683 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f57vv\" (UniqueName: \"kubernetes.io/projected/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-kube-api-access-f57vv\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.461786 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.461733 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv88n\" (UniqueName: \"kubernetes.io/projected/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-kube-api-access-bv88n\") pod 
\"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.562631 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562511 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-config\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.562631 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562616 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.562856 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562711 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.562856 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.562856 ip-10-0-140-130 
kubenswrapper[2566]: I0424 23:56:42.562778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-oauth-config\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.562856 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562807 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-service-ca\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.562856 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-oauth-serving-cert\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.563039 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562887 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-serving-cert\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.563039 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f57vv\" (UniqueName: \"kubernetes.io/projected/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-kube-api-access-f57vv\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " 
pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.563039 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:42.562939 2566 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 23:56:42.563039 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.562950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv88n\" (UniqueName: \"kubernetes.io/projected/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-kube-api-access-bv88n\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.563039 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:42.563008 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-tls podName:59b7a531-2098-4fe7-a25f-ff5d0983ec0c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:43.062988129 +0000 UTC m=+188.927480414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-x6n29" (UID: "59b7a531-2098-4fe7-a25f-ff5d0983ec0c") : secret "prometheus-operator-tls" not found Apr 24 23:56:42.564055 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.563897 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.564055 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.563937 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-config\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.565012 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.564955 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-service-ca\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.565669 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.565610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-oauth-serving-cert\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.566225 ip-10-0-140-130 kubenswrapper[2566]: I0424 
23:56:42.566200 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-oauth-config\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.566446 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.566422 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.567202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.567165 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-serving-cert\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.578168 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.578146 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f57vv\" (UniqueName: \"kubernetes.io/projected/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-kube-api-access-f57vv\") pod \"console-6f79d67b9c-9vgk6\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.578273 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.578206 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv88n\" (UniqueName: \"kubernetes.io/projected/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-kube-api-access-bv88n\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: 
\"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:42.677180 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.677130 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:42.807774 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:42.807726 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f79d67b9c-9vgk6"] Apr 24 23:56:42.810140 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:42.810107 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a7f4a54_a1e1_42ba_9ffc_f1f8dcb8e49f.slice/crio-a0da1b1c242dbe1924f4b799487ac377f8d74f3fcd406bc3008f66232b17f19d WatchSource:0}: Error finding container a0da1b1c242dbe1924f4b799487ac377f8d74f3fcd406bc3008f66232b17f19d: Status 404 returned error can't find the container with id a0da1b1c242dbe1924f4b799487ac377f8d74f3fcd406bc3008f66232b17f19d Apr 24 23:56:43.067924 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:43.067838 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:43.070501 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:43.070476 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59b7a531-2098-4fe7-a25f-ff5d0983ec0c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x6n29\" (UID: \"59b7a531-2098-4fe7-a25f-ff5d0983ec0c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:43.268098 ip-10-0-140-130 
kubenswrapper[2566]: I0424 23:56:43.268041 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f79d67b9c-9vgk6" event={"ID":"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f","Type":"ContainerStarted","Data":"a0da1b1c242dbe1924f4b799487ac377f8d74f3fcd406bc3008f66232b17f19d"} Apr 24 23:56:43.339668 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:43.339503 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" Apr 24 23:56:43.480862 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:43.480799 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-x6n29"] Apr 24 23:56:43.484345 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:43.484311 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59b7a531_2098_4fe7_a25f_ff5d0983ec0c.slice/crio-329dd8dc9ad396d640c155f8396e3f1262a52cb4e2889543adc4cfe044477a82 WatchSource:0}: Error finding container 329dd8dc9ad396d640c155f8396e3f1262a52cb4e2889543adc4cfe044477a82: Status 404 returned error can't find the container with id 329dd8dc9ad396d640c155f8396e3f1262a52cb4e2889543adc4cfe044477a82 Apr 24 23:56:44.274763 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:44.274713 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" event={"ID":"59b7a531-2098-4fe7-a25f-ff5d0983ec0c","Type":"ContainerStarted","Data":"329dd8dc9ad396d640c155f8396e3f1262a52cb4e2889543adc4cfe044477a82"} Apr 24 23:56:46.285299 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:46.285171 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" event={"ID":"59b7a531-2098-4fe7-a25f-ff5d0983ec0c","Type":"ContainerStarted","Data":"507db2b58563d016cd182734c086d3b5eefbfd1de4f79d577cf1a75ef867b98c"} Apr 24 
23:56:46.285299 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:46.285225 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" event={"ID":"59b7a531-2098-4fe7-a25f-ff5d0983ec0c","Type":"ContainerStarted","Data":"c441d3aceb51fb099b66daf779665f1ce3cf49ffb0dad006ad37ef1eaa12af61"} Apr 24 23:56:46.286775 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:46.286746 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f79d67b9c-9vgk6" event={"ID":"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f","Type":"ContainerStarted","Data":"006a79a4e0cfb9aed1c7aa3cf01f0dd2e9e76b1fb37ab90bb802ed2f7846d0dd"} Apr 24 23:56:46.309232 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:46.309173 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-x6n29" podStartSLOduration=1.798419387 podStartE2EDuration="4.309155514s" podCreationTimestamp="2026-04-24 23:56:42 +0000 UTC" firstStartedPulling="2026-04-24 23:56:43.486727709 +0000 UTC m=+189.351219979" lastFinishedPulling="2026-04-24 23:56:45.997463837 +0000 UTC m=+191.861956106" observedRunningTime="2026-04-24 23:56:46.307146685 +0000 UTC m=+192.171638973" watchObservedRunningTime="2026-04-24 23:56:46.309155514 +0000 UTC m=+192.173647804" Apr 24 23:56:46.327084 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:46.327030 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f79d67b9c-9vgk6" podStartSLOduration=1.136131669 podStartE2EDuration="4.327015454s" podCreationTimestamp="2026-04-24 23:56:42 +0000 UTC" firstStartedPulling="2026-04-24 23:56:42.812232471 +0000 UTC m=+188.676724759" lastFinishedPulling="2026-04-24 23:56:46.003116273 +0000 UTC m=+191.867608544" observedRunningTime="2026-04-24 23:56:46.325468111 +0000 UTC m=+192.189960398" watchObservedRunningTime="2026-04-24 23:56:46.327015454 +0000 UTC m=+192.191507742" Apr 24 
23:56:47.766967 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.766344 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qxt66"] Apr 24 23:56:47.771209 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.770319 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.773737 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.772975 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7cj9k\"" Apr 24 23:56:47.773737 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.773181 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 23:56:47.773737 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.773376 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 23:56:47.773737 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.773594 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 23:56:47.812212 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812176 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812366 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812221 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-textfile\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812366 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812273 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-sys\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812366 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-tls\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812366 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812356 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2506cd54-279b-4478-bf09-69d1721b7bee-metrics-client-ca\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812600 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812399 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxn7c\" (UniqueName: \"kubernetes.io/projected/2506cd54-279b-4478-bf09-69d1721b7bee-kube-api-access-dxn7c\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812600 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812438 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-wtmp\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812600 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-root\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.812600 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.812513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-accelerators-collector-config\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913237 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913204 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913459 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913251 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-textfile\") pod \"node-exporter-qxt66\" (UID: 
\"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913459 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-sys\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913459 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913428 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-tls\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913645 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913508 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2506cd54-279b-4478-bf09-69d1721b7bee-metrics-client-ca\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913645 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxn7c\" (UniqueName: \"kubernetes.io/projected/2506cd54-279b-4478-bf09-69d1721b7bee-kube-api-access-dxn7c\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913645 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913593 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-textfile\") pod 
\"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913795 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913647 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-wtmp\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913795 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913671 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-root\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913795 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.913712 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-accelerators-collector-config\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.913943 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:47.913925 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 23:56:47.913997 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:56:47.913986 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-tls podName:2506cd54-279b-4478-bf09-69d1721b7bee nodeName:}" failed. No retries permitted until 2026-04-24 23:56:48.413964699 +0000 UTC m=+194.278456969 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-tls") pod "node-exporter-qxt66" (UID: "2506cd54-279b-4478-bf09-69d1721b7bee") : secret "node-exporter-tls" not found Apr 24 23:56:47.914059 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.914038 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-sys\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.914320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.914280 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-root\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.914443 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.914363 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-wtmp\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.914820 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.914796 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-accelerators-collector-config\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.914931 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.914900 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2506cd54-279b-4478-bf09-69d1721b7bee-metrics-client-ca\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.917200 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.917176 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:47.926208 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:47.926184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxn7c\" (UniqueName: \"kubernetes.io/projected/2506cd54-279b-4478-bf09-69d1721b7bee-kube-api-access-dxn7c\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:48.419526 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:48.419490 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-tls\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:48.422174 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:48.422148 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2506cd54-279b-4478-bf09-69d1721b7bee-node-exporter-tls\") pod \"node-exporter-qxt66\" (UID: \"2506cd54-279b-4478-bf09-69d1721b7bee\") " pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:48.684539 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:48.684508 2566 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qxt66" Apr 24 23:56:48.694329 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:48.694284 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2506cd54_279b_4478_bf09_69d1721b7bee.slice/crio-37a0cbf5e650c0a658e2358585bc62973412d1755c37c61cd6e1fc8dd4020c57 WatchSource:0}: Error finding container 37a0cbf5e650c0a658e2358585bc62973412d1755c37c61cd6e1fc8dd4020c57: Status 404 returned error can't find the container with id 37a0cbf5e650c0a658e2358585bc62973412d1755c37c61cd6e1fc8dd4020c57 Apr 24 23:56:49.297032 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:49.296991 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxt66" event={"ID":"2506cd54-279b-4478-bf09-69d1721b7bee","Type":"ContainerStarted","Data":"37a0cbf5e650c0a658e2358585bc62973412d1755c37c61cd6e1fc8dd4020c57"} Apr 24 23:56:50.301204 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.301154 2566 generic.go:358] "Generic (PLEG): container finished" podID="2506cd54-279b-4478-bf09-69d1721b7bee" containerID="4acae0e7b1d5757fb4de74502ec9a81057cd771342cc293848f06a6a8cfc0829" exitCode=0 Apr 24 23:56:50.301744 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.301227 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxt66" event={"ID":"2506cd54-279b-4478-bf09-69d1721b7bee","Type":"ContainerDied","Data":"4acae0e7b1d5757fb4de74502ec9a81057cd771342cc293848f06a6a8cfc0829"} Apr 24 23:56:50.743320 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.743282 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-69d9f4f478-2gp9t"] Apr 24 23:56:50.748502 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.748476 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5474594f84-8bcfv"] 
Apr 24 23:56:50.748723 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.748696 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.751716 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.751678 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 23:56:50.751902 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.751805 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.752043 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.752015 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 23:56:50.752188 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.752175 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5kn7ts0135ah6\"" Apr 24 23:56:50.752799 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.752645 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 23:56:50.752799 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.752660 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 23:56:50.752799 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.752707 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-kkskq\"" Apr 24 23:56:50.753070 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.752969 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 
23:56:50.759992 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.759950 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5474594f84-8bcfv"] Apr 24 23:56:50.761531 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.761510 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 23:56:50.762267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.762245 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-69d9f4f478-2gp9t"] Apr 24 23:56:50.840227 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-oauth-serving-cert\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.840423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840237 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-serving-cert\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.840423 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840257 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840423 ip-10-0-140-130 
kubenswrapper[2566]: I0424 23:56:50.840312 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840614 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840443 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-config\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.840614 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840474 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-oauth-config\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.840614 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840498 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-service-ca\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.840614 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840544 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-grpc-tls\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840680 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-765zv\" (UniqueName: \"kubernetes.io/projected/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-kube-api-access-765zv\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-metrics-client-ca\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840786 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: 
\"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.840845 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840828 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-tls\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.841062 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840877 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptrt4\" (UniqueName: \"kubernetes.io/projected/47eefd6c-a017-4a64-858e-0b60d8c07db9-kube-api-access-ptrt4\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.841062 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.840904 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-trusted-ca-bundle\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.941793 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.941762 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-765zv\" (UniqueName: \"kubernetes.io/projected/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-kube-api-access-765zv\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.941793 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.941800 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-metrics-client-ca\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942024 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.941828 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942024 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.941854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-tls\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942024 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.941896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptrt4\" (UniqueName: \"kubernetes.io/projected/47eefd6c-a017-4a64-858e-0b60d8c07db9-kube-api-access-ptrt4\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942140 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-trusted-ca-bundle\") pod \"console-5474594f84-8bcfv\" (UID: 
\"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942140 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942095 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-oauth-serving-cert\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942140 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-serving-cert\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942283 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942156 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942283 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942184 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942359 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942306 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-config\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942359 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942338 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-oauth-config\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942458 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942365 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-service-ca\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.942458 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942399 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942650 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-metrics-client-ca\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " 
pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.942901 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.942878 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-oauth-serving-cert\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.943069 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.943046 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-config\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.943605 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.943225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-grpc-tls\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.943713 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.943693 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-trusted-ca-bundle\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.945052 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.945024 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-service-ca\") pod \"console-5474594f84-8bcfv\" (UID: 
\"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.945714 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.945692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.946560 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.946535 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-serving-cert\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.946849 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.946805 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-tls\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.947097 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.947052 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-grpc-tls\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.947287 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.947259 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.947730 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.947708 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-oauth-config\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:50.948624 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.948551 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.949702 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.949654 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: \"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.950521 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.950502 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-765zv\" (UniqueName: \"kubernetes.io/projected/26a6d56b-6407-48fb-bc9c-72e2a36ad99f-kube-api-access-765zv\") pod \"thanos-querier-69d9f4f478-2gp9t\" (UID: 
\"26a6d56b-6407-48fb-bc9c-72e2a36ad99f\") " pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:50.950671 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:50.950649 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptrt4\" (UniqueName: \"kubernetes.io/projected/47eefd6c-a017-4a64-858e-0b60d8c07db9-kube-api-access-ptrt4\") pod \"console-5474594f84-8bcfv\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:51.064171 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:51.064083 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:56:51.070948 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:51.070924 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:56:52.156182 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.154858 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8569fb9b5c-8mzsg"] Apr 24 23:56:52.158317 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.158288 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.161980 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.161955 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 23:56:52.162123 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.161995 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-b2hnc\"" Apr 24 23:56:52.162123 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.162006 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 23:56:52.162123 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.162015 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 23:56:52.162123 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.162024 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-vmeaoof7rm34\"" Apr 24 23:56:52.162396 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.162375 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 23:56:52.168671 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.168644 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8569fb9b5c-8mzsg"] Apr 24 23:56:52.255286 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.255253 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8w7t\" (UniqueName: \"kubernetes.io/projected/7a35a06b-eb3e-4b02-86b4-6b9b66124779-kube-api-access-j8w7t\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " 
pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.255477 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.255381 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7a35a06b-eb3e-4b02-86b4-6b9b66124779-metrics-server-audit-profiles\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.255477 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.255415 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-secret-metrics-server-client-certs\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.255477 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.255443 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a35a06b-eb3e-4b02-86b4-6b9b66124779-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.255664 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.255519 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-client-ca-bundle\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.255664 ip-10-0-140-130 
kubenswrapper[2566]: I0424 23:56:52.255560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7a35a06b-eb3e-4b02-86b4-6b9b66124779-audit-log\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.255664 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.255631 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-secret-metrics-server-tls\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.356955 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.356912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8w7t\" (UniqueName: \"kubernetes.io/projected/7a35a06b-eb3e-4b02-86b4-6b9b66124779-kube-api-access-j8w7t\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357142 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357018 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7a35a06b-eb3e-4b02-86b4-6b9b66124779-metrics-server-audit-profiles\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357142 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357051 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-secret-metrics-server-client-certs\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357142 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357075 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a35a06b-eb3e-4b02-86b4-6b9b66124779-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357142 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357118 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-client-ca-bundle\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7a35a06b-eb3e-4b02-86b4-6b9b66124779-audit-log\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357346 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357223 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-secret-metrics-server-tls\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" 
Apr 24 23:56:52.357854 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357820 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7a35a06b-eb3e-4b02-86b4-6b9b66124779-audit-log\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.357997 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.357948 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a35a06b-eb3e-4b02-86b4-6b9b66124779-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.358191 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.358163 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7a35a06b-eb3e-4b02-86b4-6b9b66124779-metrics-server-audit-profiles\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.360006 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.359979 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-client-ca-bundle\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.360109 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.360010 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-secret-metrics-server-tls\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.360357 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.360338 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7a35a06b-eb3e-4b02-86b4-6b9b66124779-secret-metrics-server-client-certs\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.393033 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.392997 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8w7t\" (UniqueName: \"kubernetes.io/projected/7a35a06b-eb3e-4b02-86b4-6b9b66124779-kube-api-access-j8w7t\") pod \"metrics-server-8569fb9b5c-8mzsg\" (UID: \"7a35a06b-eb3e-4b02-86b4-6b9b66124779\") " pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.470223 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.470193 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:56:52.678514 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.678074 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:52.678514 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.678130 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:56:52.679733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.679692 2566 patch_prober.go:28] interesting pod/console-6f79d67b9c-9vgk6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.18:8443/health\": dial tcp 10.132.0.18:8443: connect: connection refused" start-of-body= Apr 24 23:56:52.679858 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:52.679750 2566 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6f79d67b9c-9vgk6" podUID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" containerName="console" probeResult="failure" output="Get \"https://10.132.0.18:8443/health\": dial tcp 10.132.0.18:8443: connect: connection refused" Apr 24 23:56:57.203630 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:57.203603 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-69d9f4f478-2gp9t"] Apr 24 23:56:57.205325 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:57.205294 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26a6d56b_6407_48fb_bc9c_72e2a36ad99f.slice/crio-7c15b27b62b5b4b3497a362a5cbfe2c5debe5dc6a39e2e0902faa01c11b7dcc8 WatchSource:0}: Error finding container 7c15b27b62b5b4b3497a362a5cbfe2c5debe5dc6a39e2e0902faa01c11b7dcc8: Status 404 returned error can't find the container with id 7c15b27b62b5b4b3497a362a5cbfe2c5debe5dc6a39e2e0902faa01c11b7dcc8 Apr 24 
23:56:57.324749 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:57.324657 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"7c15b27b62b5b4b3497a362a5cbfe2c5debe5dc6a39e2e0902faa01c11b7dcc8"} Apr 24 23:56:57.326210 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:57.326179 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxt66" event={"ID":"2506cd54-279b-4478-bf09-69d1721b7bee","Type":"ContainerStarted","Data":"2a359b4d76aeb6a483b7951e13d5e98b47511b30e342687440368c89de7b0f04"} Apr 24 23:56:57.419185 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:57.419161 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8569fb9b5c-8mzsg"] Apr 24 23:56:57.421532 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:57.421498 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a35a06b_eb3e_4b02_86b4_6b9b66124779.slice/crio-a41d002856b62fb95ac677b33163d09fc18064ba9c14eb04d0fea7c933fd4eb5 WatchSource:0}: Error finding container a41d002856b62fb95ac677b33163d09fc18064ba9c14eb04d0fea7c933fd4eb5: Status 404 returned error can't find the container with id a41d002856b62fb95ac677b33163d09fc18064ba9c14eb04d0fea7c933fd4eb5 Apr 24 23:56:57.422669 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:57.422639 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5474594f84-8bcfv"] Apr 24 23:56:57.426724 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:57.426701 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47eefd6c_a017_4a64_858e_0b60d8c07db9.slice/crio-169569f5562cb44e135bdec064e313858d50512415057cea963a440c12728ed5 WatchSource:0}: Error finding container 
169569f5562cb44e135bdec064e313858d50512415057cea963a440c12728ed5: Status 404 returned error can't find the container with id 169569f5562cb44e135bdec064e313858d50512415057cea963a440c12728ed5 Apr 24 23:56:58.175230 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.175177 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f79d67b9c-9vgk6"] Apr 24 23:56:58.198174 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.198119 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68f7dc7566-rp2x4"] Apr 24 23:56:58.202008 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.201975 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.208900 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.208860 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f7dc7566-rp2x4"] Apr 24 23:56:58.326664 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.326633 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97mm4\" (UniqueName: \"kubernetes.io/projected/71d8cfb9-113b-476c-81b2-993b02ee81f3-kube-api-access-97mm4\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.326932 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.326678 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-oauth-config\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.326932 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.326740 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-config\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.327262 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.327125 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-oauth-serving-cert\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.327410 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.327391 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-serving-cert\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.327510 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.327438 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-trusted-ca-bundle\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.328210 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.327978 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-service-ca\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 
23:56:58.332880 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.332850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxt66" event={"ID":"2506cd54-279b-4478-bf09-69d1721b7bee","Type":"ContainerStarted","Data":"7e3cea14889029ff7f11cc8cd2fd452c817facb18e279a715774743466ad0ba6"} Apr 24 23:56:58.334806 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.334719 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" event={"ID":"7a35a06b-eb3e-4b02-86b4-6b9b66124779","Type":"ContainerStarted","Data":"a41d002856b62fb95ac677b33163d09fc18064ba9c14eb04d0fea7c933fd4eb5"} Apr 24 23:56:58.337339 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.337314 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5474594f84-8bcfv" event={"ID":"47eefd6c-a017-4a64-858e-0b60d8c07db9","Type":"ContainerStarted","Data":"77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781"} Apr 24 23:56:58.337439 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.337345 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5474594f84-8bcfv" event={"ID":"47eefd6c-a017-4a64-858e-0b60d8c07db9","Type":"ContainerStarted","Data":"169569f5562cb44e135bdec064e313858d50512415057cea963a440c12728ed5"} Apr 24 23:56:58.341065 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.338858 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rnv26" event={"ID":"b365c4f6-54a2-4538-bc8c-68262709ee19","Type":"ContainerStarted","Data":"bf42745d0c3dc41c3be46b0f783890d2b8c900d724d75471cb283bce6e5d9835"} Apr 24 23:56:58.341065 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.339417 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:58.350635 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.350611 2566 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rnv26" Apr 24 23:56:58.385813 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.385663 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qxt66" podStartSLOduration=10.635214372 podStartE2EDuration="11.385644317s" podCreationTimestamp="2026-04-24 23:56:47 +0000 UTC" firstStartedPulling="2026-04-24 23:56:48.696956504 +0000 UTC m=+194.561448784" lastFinishedPulling="2026-04-24 23:56:49.447386448 +0000 UTC m=+195.311878729" observedRunningTime="2026-04-24 23:56:58.383776249 +0000 UTC m=+204.248268538" watchObservedRunningTime="2026-04-24 23:56:58.385644317 +0000 UTC m=+204.250136601" Apr 24 23:56:58.402983 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.402268 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rnv26" podStartSLOduration=1.554663984 podStartE2EDuration="18.402250903s" podCreationTimestamp="2026-04-24 23:56:40 +0000 UTC" firstStartedPulling="2026-04-24 23:56:40.644483249 +0000 UTC m=+186.508975514" lastFinishedPulling="2026-04-24 23:56:57.492070155 +0000 UTC m=+203.356562433" observedRunningTime="2026-04-24 23:56:58.399473675 +0000 UTC m=+204.263965974" watchObservedRunningTime="2026-04-24 23:56:58.402250903 +0000 UTC m=+204.266743191" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-serving-cert\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-trusted-ca-bundle\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429502 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-service-ca\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429536 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97mm4\" (UniqueName: \"kubernetes.io/projected/71d8cfb9-113b-476c-81b2-993b02ee81f3-kube-api-access-97mm4\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429607 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-oauth-config\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-config\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.429784 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-oauth-serving-cert\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.430648 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.430498 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-oauth-serving-cert\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.431202 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.431082 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-service-ca\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.431589 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.431525 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-config\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.432100 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.432032 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-trusted-ca-bundle\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.434074 ip-10-0-140-130 kubenswrapper[2566]: 
I0424 23:56:58.433946 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-oauth-config\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.435315 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.435282 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-serving-cert\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.443220 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.443024 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97mm4\" (UniqueName: \"kubernetes.io/projected/71d8cfb9-113b-476c-81b2-993b02ee81f3-kube-api-access-97mm4\") pod \"console-68f7dc7566-rp2x4\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") " pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.517319 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.517161 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:56:58.662109 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.662049 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5474594f84-8bcfv" podStartSLOduration=8.662026089 podStartE2EDuration="8.662026089s" podCreationTimestamp="2026-04-24 23:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:58.417447824 +0000 UTC m=+204.281940114" watchObservedRunningTime="2026-04-24 23:56:58.662026089 +0000 UTC m=+204.526518377" Apr 24 23:56:58.663592 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:58.663547 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f7dc7566-rp2x4"] Apr 24 23:56:59.121798 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:56:59.121748 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71d8cfb9_113b_476c_81b2_993b02ee81f3.slice/crio-1968e188655dbe1aa41b7201ec823f592433496b3afacb81d85dc1cc2b503ed5 WatchSource:0}: Error finding container 1968e188655dbe1aa41b7201ec823f592433496b3afacb81d85dc1cc2b503ed5: Status 404 returned error can't find the container with id 1968e188655dbe1aa41b7201ec823f592433496b3afacb81d85dc1cc2b503ed5 Apr 24 23:56:59.343913 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:56:59.343873 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f7dc7566-rp2x4" event={"ID":"71d8cfb9-113b-476c-81b2-993b02ee81f3","Type":"ContainerStarted","Data":"1968e188655dbe1aa41b7201ec823f592433496b3afacb81d85dc1cc2b503ed5"} Apr 24 23:57:00.349211 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.349129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" 
event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"959a88ce34050d474575ad5a34cb2e3e5a2f1aa8c1f9a0973b98302be0709407"} Apr 24 23:57:00.349211 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.349174 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"ed938e3bd9e08a005d76a24ef7c82a3458bb8af20cdab247e2a14c76edd563f4"} Apr 24 23:57:00.349211 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.349191 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"306d89c528c6ae58505edea87929c437e2d03c149f7669822784f3e273b48394"} Apr 24 23:57:00.350768 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.350743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f7dc7566-rp2x4" event={"ID":"71d8cfb9-113b-476c-81b2-993b02ee81f3","Type":"ContainerStarted","Data":"6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464"} Apr 24 23:57:00.352585 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.352535 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" event={"ID":"7a35a06b-eb3e-4b02-86b4-6b9b66124779","Type":"ContainerStarted","Data":"a2b44a9483a922333145e62c7cfa0a9021fcef303f8de895634afb303ef95ab7"} Apr 24 23:57:00.368936 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.368892 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68f7dc7566-rp2x4" podStartSLOduration=2.368873844 podStartE2EDuration="2.368873844s" podCreationTimestamp="2026-04-24 23:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 
23:57:00.366990967 +0000 UTC m=+206.231483255" watchObservedRunningTime="2026-04-24 23:57:00.368873844 +0000 UTC m=+206.233366132" Apr 24 23:57:00.383912 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:00.383856 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" podStartSLOduration=6.020783878 podStartE2EDuration="8.383839388s" podCreationTimestamp="2026-04-24 23:56:52 +0000 UTC" firstStartedPulling="2026-04-24 23:56:57.4443779 +0000 UTC m=+203.308870184" lastFinishedPulling="2026-04-24 23:56:59.807433428 +0000 UTC m=+205.671925694" observedRunningTime="2026-04-24 23:57:00.381886108 +0000 UTC m=+206.246378396" watchObservedRunningTime="2026-04-24 23:57:00.383839388 +0000 UTC m=+206.248331676" Apr 24 23:57:01.072118 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:01.072076 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:57:01.072341 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:01.072131 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:57:01.078849 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:01.078820 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:57:01.359524 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:01.359448 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:57:02.042266 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.042227 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-785f97fdcb-rww6s"] Apr 24 23:57:02.042639 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:57:02.042611 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached 
volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" podUID="e290ed95-0b8a-4c01-aa97-91f4caed9f63" Apr 24 23:57:02.361747 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.361702 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"a267204482a3bf287e56c974217cffe52b0f6a249e69a2e4f6b7d105d907461c"} Apr 24 23:57:02.361747 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.361730 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:57:02.361747 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.361748 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"e5178d43627698ff7493fd4c016a24830410cb6cfca8816fca906542d22d90a9"} Apr 24 23:57:02.362306 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.361762 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" event={"ID":"26a6d56b-6407-48fb-bc9c-72e2a36ad99f","Type":"ContainerStarted","Data":"e3d49e75a6914a74ee750aab785b707d390ad901f8a8f36dac57d222bd6d9b7e"} Apr 24 23:57:02.362306 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.362028 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:57:02.367294 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.367270 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:57:02.386132 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.386055 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" podStartSLOduration=7.994645993 podStartE2EDuration="12.386040095s" podCreationTimestamp="2026-04-24 23:56:50 +0000 UTC" firstStartedPulling="2026-04-24 23:56:57.207746448 +0000 UTC m=+203.072238714" lastFinishedPulling="2026-04-24 23:57:01.599140542 +0000 UTC m=+207.463632816" observedRunningTime="2026-04-24 23:57:02.384822944 +0000 UTC m=+208.249315234" watchObservedRunningTime="2026-04-24 23:57:02.386040095 +0000 UTC m=+208.250532382" Apr 24 23:57:02.470894 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.470857 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-bound-sa-token\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471083 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.470923 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-trusted-ca\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471083 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.470954 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-image-registry-private-configuration\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471083 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471004 2566 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e290ed95-0b8a-4c01-aa97-91f4caed9f63-ca-trust-extracted\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471241 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471118 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-certificates\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471241 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471197 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v99tr\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-kube-api-access-v99tr\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471241 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471221 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e290ed95-0b8a-4c01-aa97-91f4caed9f63-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:57:02.471340 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471243 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-installation-pull-secrets\") pod \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\" (UID: \"e290ed95-0b8a-4c01-aa97-91f4caed9f63\") " Apr 24 23:57:02.471415 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471391 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:02.471486 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.471388 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:02.479271 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.473824 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e290ed95-0b8a-4c01-aa97-91f4caed9f63-ca-trust-extracted\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:02.479271 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.473854 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-certificates\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:02.479271 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.473883 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e290ed95-0b8a-4c01-aa97-91f4caed9f63-trusted-ca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:02.481331 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.480350 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:02.481331 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.480618 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:02.481331 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.480626 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:02.481687 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.481647 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-kube-api-access-v99tr" (OuterVolumeSpecName: "kube-api-access-v99tr") pod "e290ed95-0b8a-4c01-aa97-91f4caed9f63" (UID: "e290ed95-0b8a-4c01-aa97-91f4caed9f63"). InnerVolumeSpecName "kube-api-access-v99tr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:02.574447 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.574407 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-bound-sa-token\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:02.574447 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.574443 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-image-registry-private-configuration\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:02.574447 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.574454 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v99tr\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-kube-api-access-v99tr\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 
23:57:02.574742 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:02.574466 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e290ed95-0b8a-4c01-aa97-91f4caed9f63-installation-pull-secrets\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:03.365251 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:03.365216 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-785f97fdcb-rww6s" Apr 24 23:57:03.397335 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:03.397308 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-785f97fdcb-rww6s"] Apr 24 23:57:03.400724 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:03.400697 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-785f97fdcb-rww6s"] Apr 24 23:57:03.484897 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:03.484838 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e290ed95-0b8a-4c01-aa97-91f4caed9f63-registry-tls\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:04.731772 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:04.731737 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e290ed95-0b8a-4c01-aa97-91f4caed9f63" path="/var/lib/kubelet/pods/e290ed95-0b8a-4c01-aa97-91f4caed9f63/volumes" Apr 24 23:57:08.371469 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:08.371439 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-69d9f4f478-2gp9t" Apr 24 23:57:08.517476 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:08.517437 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:57:08.517476 ip-10-0-140-130 
kubenswrapper[2566]: I0424 23:57:08.517481 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:57:08.522448 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:08.522423 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:57:09.386206 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:09.386179 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68f7dc7566-rp2x4" Apr 24 23:57:09.433381 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:09.433349 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5474594f84-8bcfv"] Apr 24 23:57:12.470447 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:12.470403 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:57:12.470928 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:12.470478 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:57:23.201318 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.201252 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f79d67b9c-9vgk6" podUID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" containerName="console" containerID="cri-o://006a79a4e0cfb9aed1c7aa3cf01f0dd2e9e76b1fb37ab90bb802ed2f7846d0dd" gracePeriod=15 Apr 24 23:57:23.423065 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.421849 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f79d67b9c-9vgk6_0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f/console/0.log" Apr 24 23:57:23.423065 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.421897 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" containerID="006a79a4e0cfb9aed1c7aa3cf01f0dd2e9e76b1fb37ab90bb802ed2f7846d0dd" exitCode=2 Apr 24 23:57:23.423065 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.421974 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f79d67b9c-9vgk6" event={"ID":"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f","Type":"ContainerDied","Data":"006a79a4e0cfb9aed1c7aa3cf01f0dd2e9e76b1fb37ab90bb802ed2f7846d0dd"} Apr 24 23:57:23.461697 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.461644 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f79d67b9c-9vgk6_0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f/console/0.log" Apr 24 23:57:23.461806 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.461701 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:57:23.563310 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563279 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-oauth-config\") pod \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " Apr 24 23:57:23.563472 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563324 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-serving-cert\") pod \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " Apr 24 23:57:23.563472 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563348 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-oauth-serving-cert\") pod 
\"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " Apr 24 23:57:23.563472 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563405 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-config\") pod \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " Apr 24 23:57:23.563472 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563433 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-service-ca\") pod \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " Apr 24 23:57:23.563706 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563479 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f57vv\" (UniqueName: \"kubernetes.io/projected/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-kube-api-access-f57vv\") pod \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\" (UID: \"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f\") " Apr 24 23:57:23.563861 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563826 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" (UID: "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:23.563861 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563836 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-config" (OuterVolumeSpecName: "console-config") pod "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" (UID: "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:23.563973 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.563879 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-service-ca" (OuterVolumeSpecName: "service-ca") pod "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" (UID: "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:23.565581 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.565540 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" (UID: "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:23.565698 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.565600 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" (UID: "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:23.565776 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.565749 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-kube-api-access-f57vv" (OuterVolumeSpecName: "kube-api-access-f57vv") pod "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" (UID: "0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f"). InnerVolumeSpecName "kube-api-access-f57vv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:23.664424 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.664385 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-oauth-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:23.664424 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.664417 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:23.664424 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.664430 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-oauth-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:23.664699 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.664444 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-console-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:23.664699 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.664458 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-service-ca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:23.664699 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:23.664469 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f57vv\" (UniqueName: \"kubernetes.io/projected/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f-kube-api-access-f57vv\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:24.425784 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.425757 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f79d67b9c-9vgk6_0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f/console/0.log" Apr 24 23:57:24.426162 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.425833 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f79d67b9c-9vgk6" event={"ID":"0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f","Type":"ContainerDied","Data":"a0da1b1c242dbe1924f4b799487ac377f8d74f3fcd406bc3008f66232b17f19d"} Apr 24 23:57:24.426162 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.425871 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f79d67b9c-9vgk6" Apr 24 23:57:24.426162 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.425871 2566 scope.go:117] "RemoveContainer" containerID="006a79a4e0cfb9aed1c7aa3cf01f0dd2e9e76b1fb37ab90bb802ed2f7846d0dd" Apr 24 23:57:24.446226 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.446201 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f79d67b9c-9vgk6"] Apr 24 23:57:24.452354 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.452332 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f79d67b9c-9vgk6"] Apr 24 23:57:24.729958 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:24.729926 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" path="/var/lib/kubelet/pods/0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f/volumes" Apr 24 23:57:32.475345 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:32.475315 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:57:32.479411 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:32.479392 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8569fb9b5c-8mzsg" Apr 24 23:57:33.453660 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:33.453627 2566 generic.go:358] "Generic (PLEG): container finished" podID="1d003339-504a-4e95-aba4-a47bafe0f0d6" containerID="76fc40c60e0fd5114b8c63a9caae37b3a6bfd9a879da9ef109f850742d23b97d" exitCode=0 Apr 24 23:57:33.453836 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:33.453700 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5tx5" event={"ID":"1d003339-504a-4e95-aba4-a47bafe0f0d6","Type":"ContainerDied","Data":"76fc40c60e0fd5114b8c63a9caae37b3a6bfd9a879da9ef109f850742d23b97d"} Apr 24 23:57:33.454228 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:33.454210 2566 scope.go:117] "RemoveContainer" containerID="76fc40c60e0fd5114b8c63a9caae37b3a6bfd9a879da9ef109f850742d23b97d" Apr 24 23:57:34.451885 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.451844 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5474594f84-8bcfv" podUID="47eefd6c-a017-4a64-858e-0b60d8c07db9" containerName="console" containerID="cri-o://77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781" gracePeriod=15 Apr 24 23:57:34.458216 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.458191 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5tx5" event={"ID":"1d003339-504a-4e95-aba4-a47bafe0f0d6","Type":"ContainerStarted","Data":"951018cf2df9548ef84c4d316806dc45a2ab437817b08ea5f4bf1678996d71db"} Apr 24 23:57:34.717237 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.717211 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tdkpx_07e0eebb-8365-490f-b2b2-16f26075fac7/dns-node-resolver/0.log" Apr 24 23:57:34.717915 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.717897 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5474594f84-8bcfv_47eefd6c-a017-4a64-858e-0b60d8c07db9/console/0.log" Apr 24 23:57:34.718012 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.717951 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:57:34.863818 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.863786 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-serving-cert\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.863818 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.863826 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-oauth-config\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.864090 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.863889 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-service-ca\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.864090 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.863922 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-trusted-ca-bundle\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.864090 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.863943 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-oauth-serving-cert\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.864090 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.863969 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-config\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.864090 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.864012 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptrt4\" (UniqueName: \"kubernetes.io/projected/47eefd6c-a017-4a64-858e-0b60d8c07db9-kube-api-access-ptrt4\") pod \"47eefd6c-a017-4a64-858e-0b60d8c07db9\" (UID: \"47eefd6c-a017-4a64-858e-0b60d8c07db9\") " Apr 24 23:57:34.864368 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.864334 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-service-ca" (OuterVolumeSpecName: "service-ca") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:34.864428 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.864341 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:34.864586 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.864447 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-config" (OuterVolumeSpecName: "console-config") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:34.864710 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.864682 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:34.866243 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.866222 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47eefd6c-a017-4a64-858e-0b60d8c07db9-kube-api-access-ptrt4" (OuterVolumeSpecName: "kube-api-access-ptrt4") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "kube-api-access-ptrt4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:34.866510 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.866484 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:34.866643 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.866510 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "47eefd6c-a017-4a64-858e-0b60d8c07db9" (UID: "47eefd6c-a017-4a64-858e-0b60d8c07db9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.964959 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-oauth-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.964985 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-service-ca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.964996 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-trusted-ca-bundle\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.965006 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-oauth-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.965016 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.965024 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptrt4\" (UniqueName: \"kubernetes.io/projected/47eefd6c-a017-4a64-858e-0b60d8c07db9-kube-api-access-ptrt4\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:34.965036 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:34.965034 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47eefd6c-a017-4a64-858e-0b60d8c07db9-console-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:57:35.462512 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.462490 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5474594f84-8bcfv_47eefd6c-a017-4a64-858e-0b60d8c07db9/console/0.log" Apr 24 23:57:35.462917 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.462527 2566 generic.go:358] "Generic (PLEG): container finished" podID="47eefd6c-a017-4a64-858e-0b60d8c07db9" containerID="77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781" exitCode=2 Apr 24 23:57:35.462917 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.462561 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5474594f84-8bcfv" event={"ID":"47eefd6c-a017-4a64-858e-0b60d8c07db9","Type":"ContainerDied","Data":"77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781"} Apr 24 23:57:35.462917 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.462604 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5474594f84-8bcfv" Apr 24 23:57:35.462917 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.462621 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5474594f84-8bcfv" event={"ID":"47eefd6c-a017-4a64-858e-0b60d8c07db9","Type":"ContainerDied","Data":"169569f5562cb44e135bdec064e313858d50512415057cea963a440c12728ed5"} Apr 24 23:57:35.462917 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.462640 2566 scope.go:117] "RemoveContainer" containerID="77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781" Apr 24 23:57:35.471032 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.471017 2566 scope.go:117] "RemoveContainer" containerID="77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781" Apr 24 23:57:35.471294 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:57:35.471266 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781\": container with ID starting with 77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781 not found: ID does not exist" containerID="77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781" Apr 24 23:57:35.471354 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.471303 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781"} err="failed to get container status \"77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781\": rpc error: code = NotFound desc = could not find container \"77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781\": container with ID starting with 77c0b829eef5965149b6e2461fad3795f3fc3bcd533f8c53f9a020910a423781 not found: ID does not exist" Apr 24 23:57:35.483449 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.483427 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5474594f84-8bcfv"] Apr 24 23:57:35.488358 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:35.488340 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5474594f84-8bcfv"] Apr 24 23:57:36.730779 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:36.730746 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47eefd6c-a017-4a64-858e-0b60d8c07db9" path="/var/lib/kubelet/pods/47eefd6c-a017-4a64-858e-0b60d8c07db9/volumes" Apr 24 23:57:46.561047 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:46.561002 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:57:46.564051 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:46.564022 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e101d25b-89b6-4522-8e39-35b94ce4d935-metrics-certs\") pod \"network-metrics-daemon-wf82j\" (UID: \"e101d25b-89b6-4522-8e39-35b94ce4d935\") " pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:57:46.630872 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:46.630843 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qtxb2\"" Apr 24 23:57:46.639156 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:46.639131 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wf82j" Apr 24 23:57:46.794212 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:46.794178 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wf82j"] Apr 24 23:57:46.796987 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:57:46.796957 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode101d25b_89b6_4522_8e39_35b94ce4d935.slice/crio-d507c1ee9e0a41c471dfb2188576c64113bbdb680152a9eade4c3497010c2d6b WatchSource:0}: Error finding container d507c1ee9e0a41c471dfb2188576c64113bbdb680152a9eade4c3497010c2d6b: Status 404 returned error can't find the container with id d507c1ee9e0a41c471dfb2188576c64113bbdb680152a9eade4c3497010c2d6b Apr 24 23:57:47.500624 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:47.500583 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wf82j" event={"ID":"e101d25b-89b6-4522-8e39-35b94ce4d935","Type":"ContainerStarted","Data":"d507c1ee9e0a41c471dfb2188576c64113bbdb680152a9eade4c3497010c2d6b"} Apr 24 23:57:48.504426 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:48.504389 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wf82j" event={"ID":"e101d25b-89b6-4522-8e39-35b94ce4d935","Type":"ContainerStarted","Data":"949523532b6448e1a43ae53fa87289eff77df58ef4e82447c9128fbc1b24fc14"} Apr 24 23:57:48.504426 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:48.504426 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wf82j" event={"ID":"e101d25b-89b6-4522-8e39-35b94ce4d935","Type":"ContainerStarted","Data":"89ea10c3ce9aa19b24c4ca7eb145f9e2b20b354fdba54cc9290e8f3b8f59fc0f"} Apr 24 23:57:48.521385 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:57:48.521325 2566 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-wf82j" podStartSLOduration=253.482818398 podStartE2EDuration="4m14.521309811s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:57:46.79884817 +0000 UTC m=+252.663340437" lastFinishedPulling="2026-04-24 23:57:47.837339568 +0000 UTC m=+253.701831850" observedRunningTime="2026-04-24 23:57:48.520062991 +0000 UTC m=+254.384555292" watchObservedRunningTime="2026-04-24 23:57:48.521309811 +0000 UTC m=+254.385802102" Apr 24 23:58:12.008206 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008166 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g"] Apr 24 23:58:12.008709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008511 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" containerName="console" Apr 24 23:58:12.008709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008525 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" containerName="console" Apr 24 23:58:12.008709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008583 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47eefd6c-a017-4a64-858e-0b60d8c07db9" containerName="console" Apr 24 23:58:12.008709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008589 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="47eefd6c-a017-4a64-858e-0b60d8c07db9" containerName="console" Apr 24 23:58:12.008709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008641 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a7f4a54-a1e1-42ba-9ffc-f1f8dcb8e49f" containerName="console" Apr 24 23:58:12.008709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.008650 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="47eefd6c-a017-4a64-858e-0b60d8c07db9" containerName="console" 
Apr 24 23:58:12.015050 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.015023 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.017551 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.017527 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 23:58:12.017700 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.017556 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 23:58:12.017700 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.017629 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 23:58:12.017700 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.017637 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 23:58:12.017700 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.017556 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-g9jbb\"" Apr 24 23:58:12.017922 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.017633 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 23:58:12.023529 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.023418 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 23:58:12.025366 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.025342 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g"] Apr 24 23:58:12.079552 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079518 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npr86\" (UniqueName: \"kubernetes.io/projected/db915ee5-9581-4aea-8dd6-0db56a3017b1-kube-api-access-npr86\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079774 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079556 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-secret-telemeter-client\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079774 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079602 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-telemeter-client-tls\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079774 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079687 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-federate-client-tls\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079774 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079723 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079774 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079761 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-metrics-client-ca\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079865 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.079972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.079904 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-serving-certs-ca-bundle\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.181003 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.180966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.181196 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-serving-certs-ca-bundle\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.181196 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181102 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npr86\" (UniqueName: \"kubernetes.io/projected/db915ee5-9581-4aea-8dd6-0db56a3017b1-kube-api-access-npr86\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.181196 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181136 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-secret-telemeter-client\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.181196 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-telemeter-client-tls\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " 
pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.181415 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181200 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-federate-client-tls\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.182496 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181558 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.182496 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181630 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-metrics-client-ca\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.182496 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.181896 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-serving-certs-ca-bundle\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.182496 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.182250 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-metrics-client-ca\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.185480 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.183067 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db915ee5-9581-4aea-8dd6-0db56a3017b1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.185480 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.185214 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-telemeter-client-tls\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.191267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.185782 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.191267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.186159 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-federate-client-tls\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " 
pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.191267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.189762 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npr86\" (UniqueName: \"kubernetes.io/projected/db915ee5-9581-4aea-8dd6-0db56a3017b1-kube-api-access-npr86\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.191267 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.190060 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/db915ee5-9581-4aea-8dd6-0db56a3017b1-secret-telemeter-client\") pod \"telemeter-client-5d46c47bd5-z7f8g\" (UID: \"db915ee5-9581-4aea-8dd6-0db56a3017b1\") " pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.327146 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.327054 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" Apr 24 23:58:12.463508 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.463478 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g"] Apr 24 23:58:12.466052 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:58:12.466024 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb915ee5_9581_4aea_8dd6_0db56a3017b1.slice/crio-3904e8c60e8b488a582f08b1ad1f16f605a993a9f59032dd5ababf299107a743 WatchSource:0}: Error finding container 3904e8c60e8b488a582f08b1ad1f16f605a993a9f59032dd5ababf299107a743: Status 404 returned error can't find the container with id 3904e8c60e8b488a582f08b1ad1f16f605a993a9f59032dd5ababf299107a743 Apr 24 23:58:12.570279 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:12.570201 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" event={"ID":"db915ee5-9581-4aea-8dd6-0db56a3017b1","Type":"ContainerStarted","Data":"3904e8c60e8b488a582f08b1ad1f16f605a993a9f59032dd5ababf299107a743"} Apr 24 23:58:13.166927 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:58:13.166879 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" podUID="1bb33cce-974b-42c1-aafe-f821da1a3f63" Apr 24 23:58:13.572521 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:13.572492 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" Apr 24 23:58:14.575956 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:14.575924 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" event={"ID":"db915ee5-9581-4aea-8dd6-0db56a3017b1","Type":"ContainerStarted","Data":"ef46265c2d3b8e2ad9d3779a0f9cbc2209bd08cc1c26dfe10f9345d3e748225c"} Apr 24 23:58:15.580459 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:15.580423 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" event={"ID":"db915ee5-9581-4aea-8dd6-0db56a3017b1","Type":"ContainerStarted","Data":"90103fc943bb9431c1ef896537125ab48a5ca3338002c62d4f688dc0a167ac13"} Apr 24 23:58:15.580459 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:15.580460 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" event={"ID":"db915ee5-9581-4aea-8dd6-0db56a3017b1","Type":"ContainerStarted","Data":"42735883bc22eeb214c580a4b44bf44cad671b417e4bfe390ec12e3890c8947d"} Apr 24 23:58:15.607211 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:15.607162 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d46c47bd5-z7f8g" podStartSLOduration=1.962283635 podStartE2EDuration="4.607147784s" podCreationTimestamp="2026-04-24 23:58:11 +0000 UTC" firstStartedPulling="2026-04-24 23:58:12.468236761 +0000 UTC m=+278.332729030" lastFinishedPulling="2026-04-24 23:58:15.113100913 +0000 UTC m=+280.977593179" observedRunningTime="2026-04-24 23:58:15.604969187 +0000 UTC m=+281.469461475" watchObservedRunningTime="2026-04-24 23:58:15.607147784 +0000 UTC m=+281.471640072" Apr 24 23:58:16.235893 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.235858 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-849bb67d76-w9mgv"] Apr 24 
Apr 24 23:58:16.239282 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.239256 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.248471 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.248439 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-849bb67d76-w9mgv"]
Apr 24 23:58:16.320586 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320536 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-trusted-ca-bundle\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.320586 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320592 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4wjp\" (UniqueName: \"kubernetes.io/projected/99124769-77cb-4b30-9c98-f9c6be46a1cb-kube-api-access-r4wjp\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.320800 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320680 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-serving-cert\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.320800 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320735 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-oauth-config\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.320800 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320763 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-oauth-serving-cert\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.320800 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320794 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-service-ca\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.320944 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.320823 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-config\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422006 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.421953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-oauth-config\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422006 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422016 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-oauth-serving-cert\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422206 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422066 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-service-ca\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422206 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-config\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422206 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422170 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-trusted-ca-bundle\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422206 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422191 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4wjp\" (UniqueName: \"kubernetes.io/projected/99124769-77cb-4b30-9c98-f9c6be46a1cb-kube-api-access-r4wjp\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422382 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-serving-cert\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422996 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-service-ca\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.422996 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.422987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-oauth-serving-cert\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.423145 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.423033 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-config\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.423181 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.423139 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-trusted-ca-bundle\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.424556 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.424529 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-oauth-config\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.424811 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.424791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-serving-cert\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.430590 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.430548 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4wjp\" (UniqueName: \"kubernetes.io/projected/99124769-77cb-4b30-9c98-f9c6be46a1cb-kube-api-access-r4wjp\") pod \"console-849bb67d76-w9mgv\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.550551 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.550461 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:16.668754 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:16.668729 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-849bb67d76-w9mgv"]
Apr 24 23:58:16.670530 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:58:16.670497 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99124769_77cb_4b30_9c98_f9c6be46a1cb.slice/crio-174f04df21f8695e03c57a9d8e11ebcc5c1f5aa4f5eeb2188184db7c1b12fcb2 WatchSource:0}: Error finding container 174f04df21f8695e03c57a9d8e11ebcc5c1f5aa4f5eeb2188184db7c1b12fcb2: Status 404 returned error can't find the container with id 174f04df21f8695e03c57a9d8e11ebcc5c1f5aa4f5eeb2188184db7c1b12fcb2
Apr 24 23:58:17.028247 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.028194 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw"
Apr 24 23:58:17.030595 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.030561 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1bb33cce-974b-42c1-aafe-f821da1a3f63-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t7llw\" (UID: \"1bb33cce-974b-42c1-aafe-f821da1a3f63\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw"
Apr 24 23:58:17.128633 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.128590 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc"
Apr 24 23:58:17.128729 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.128693 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx"
Apr 24 23:58:17.130904 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.130878 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f-metrics-tls\") pod \"dns-default-xcntc\" (UID: \"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f\") " pod="openshift-dns/dns-default-xcntc"
Apr 24 23:58:17.131013 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.130928 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe18020-a109-4021-a4a7-567311f209f4-cert\") pod \"ingress-canary-v7wqx\" (UID: \"7fe18020-a109-4021-a4a7-567311f209f4\") " pod="openshift-ingress-canary/ingress-canary-v7wqx"
Apr 24 23:58:17.175091 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.175064 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jxmc2\""
Apr 24 23:58:17.183234 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.183214 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw"
Apr 24 23:58:17.230635 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.230608 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-drk7t\""
Apr 24 23:58:17.238709 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.238683 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcntc"
Apr 24 23:58:17.302890 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.302848 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-t7llw"]
Apr 24 23:58:17.305324 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:58:17.305293 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb33cce_974b_42c1_aafe_f821da1a3f63.slice/crio-3ad08d866ab67044fa97e6ddc0314c3348afc00c64d3c91772de6e7816f41efd WatchSource:0}: Error finding container 3ad08d866ab67044fa97e6ddc0314c3348afc00c64d3c91772de6e7816f41efd: Status 404 returned error can't find the container with id 3ad08d866ab67044fa97e6ddc0314c3348afc00c64d3c91772de6e7816f41efd
Apr 24 23:58:17.361180 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.361133 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xcntc"]
Apr 24 23:58:17.363202 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:58:17.363175 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5ab5f2_8430_48bf_b20f_c8e1fa32c31f.slice/crio-8e93d6096b380671e373ad7a11cc05441cb5f3bca39a855e94af6a36657ed582 WatchSource:0}: Error finding container 8e93d6096b380671e373ad7a11cc05441cb5f3bca39a855e94af6a36657ed582: Status 404 returned error can't find the container with id 8e93d6096b380671e373ad7a11cc05441cb5f3bca39a855e94af6a36657ed582
Apr 24 23:58:17.430088 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.430065 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gqd2q\""
Apr 24 23:58:17.455586 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.454633 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v7wqx"
Apr 24 23:58:17.571733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.571653 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v7wqx"]
Apr 24 23:58:17.574292 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:58:17.574260 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fe18020_a109_4021_a4a7_567311f209f4.slice/crio-dccf7e582f4eec7fefbf62693ef286a970f2de2a0acc2c1f314399b2f914dc6c WatchSource:0}: Error finding container dccf7e582f4eec7fefbf62693ef286a970f2de2a0acc2c1f314399b2f914dc6c: Status 404 returned error can't find the container with id dccf7e582f4eec7fefbf62693ef286a970f2de2a0acc2c1f314399b2f914dc6c
Apr 24 23:58:17.588337 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.588309 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v7wqx" event={"ID":"7fe18020-a109-4021-a4a7-567311f209f4","Type":"ContainerStarted","Data":"dccf7e582f4eec7fefbf62693ef286a970f2de2a0acc2c1f314399b2f914dc6c"}
Apr 24 23:58:17.589926 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.589902 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849bb67d76-w9mgv" event={"ID":"99124769-77cb-4b30-9c98-f9c6be46a1cb","Type":"ContainerStarted","Data":"312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3"}
Apr 24 23:58:17.590025 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.589935 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849bb67d76-w9mgv" event={"ID":"99124769-77cb-4b30-9c98-f9c6be46a1cb","Type":"ContainerStarted","Data":"174f04df21f8695e03c57a9d8e11ebcc5c1f5aa4f5eeb2188184db7c1b12fcb2"}
Apr 24 23:58:17.590918 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.590894 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcntc" event={"ID":"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f","Type":"ContainerStarted","Data":"8e93d6096b380671e373ad7a11cc05441cb5f3bca39a855e94af6a36657ed582"}
Apr 24 23:58:17.591851 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.591833 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" event={"ID":"1bb33cce-974b-42c1-aafe-f821da1a3f63","Type":"ContainerStarted","Data":"3ad08d866ab67044fa97e6ddc0314c3348afc00c64d3c91772de6e7816f41efd"}
Apr 24 23:58:17.605550 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:17.605513 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-849bb67d76-w9mgv" podStartSLOduration=1.605502234 podStartE2EDuration="1.605502234s" podCreationTimestamp="2026-04-24 23:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:58:17.605154453 +0000 UTC m=+283.469646742" watchObservedRunningTime="2026-04-24 23:58:17.605502234 +0000 UTC m=+283.469994522"
Apr 24 23:58:19.601974 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:19.601935 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v7wqx" event={"ID":"7fe18020-a109-4021-a4a7-567311f209f4","Type":"ContainerStarted","Data":"97faef8b17f58671883d978e89c99353f20b528d7604bce95efa0efa976d8201"}
Apr 24 23:58:19.603597 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:19.603545 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcntc" event={"ID":"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f","Type":"ContainerStarted","Data":"d9c212d02b3161145d0cf06ef7aab9da28eed4d27006ed4c347da488d15d0ab9"}
Apr 24 23:58:19.605052 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:19.605021 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" event={"ID":"1bb33cce-974b-42c1-aafe-f821da1a3f63","Type":"ContainerStarted","Data":"266a33db108d93c9fec142d72cd9dd3be5fefa237ad2905521f501fc46e95138"}
Apr 24 23:58:19.619627 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:19.619545 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v7wqx" podStartSLOduration=250.78741066 podStartE2EDuration="4m12.619528035s" podCreationTimestamp="2026-04-24 23:54:07 +0000 UTC" firstStartedPulling="2026-04-24 23:58:17.576346387 +0000 UTC m=+283.440838654" lastFinishedPulling="2026-04-24 23:58:19.408463763 +0000 UTC m=+285.272956029" observedRunningTime="2026-04-24 23:58:19.618179143 +0000 UTC m=+285.482671432" watchObservedRunningTime="2026-04-24 23:58:19.619528035 +0000 UTC m=+285.484020324"
Apr 24 23:58:19.634930 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:19.634877 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t7llw" podStartSLOduration=283.538398614 podStartE2EDuration="4m45.634859111s" podCreationTimestamp="2026-04-24 23:53:34 +0000 UTC" firstStartedPulling="2026-04-24 23:58:17.307742806 +0000 UTC m=+283.172235072" lastFinishedPulling="2026-04-24 23:58:19.404203297 +0000 UTC m=+285.268695569" observedRunningTime="2026-04-24 23:58:19.632428669 +0000 UTC m=+285.496920966" watchObservedRunningTime="2026-04-24 23:58:19.634859111 +0000 UTC m=+285.499351400"
Apr 24 23:58:20.609900 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:20.609851 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcntc" event={"ID":"6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f","Type":"ContainerStarted","Data":"b02b5d9b8c56ec7bede5fb7f23b09bf1d6b08a9df766600b76116274237492b5"}
Apr 24 23:58:20.610347 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:20.610086 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xcntc"
Apr 24 23:58:20.628119 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:20.628077 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xcntc" podStartSLOduration=251.588325483 podStartE2EDuration="4m13.628065016s" podCreationTimestamp="2026-04-24 23:54:07 +0000 UTC" firstStartedPulling="2026-04-24 23:58:17.365220419 +0000 UTC m=+283.229712685" lastFinishedPulling="2026-04-24 23:58:19.404959952 +0000 UTC m=+285.269452218" observedRunningTime="2026-04-24 23:58:20.627098281 +0000 UTC m=+286.491590570" watchObservedRunningTime="2026-04-24 23:58:20.628065016 +0000 UTC m=+286.492557303"
Apr 24 23:58:26.551187 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:26.551146 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:26.551187 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:26.551195 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:26.555841 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:26.555821 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:26.632135 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:26.632110 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-849bb67d76-w9mgv"
Apr 24 23:58:26.680295 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:26.680263 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f7dc7566-rp2x4"]
Apr 24 23:58:30.615742 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:30.615702 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xcntc"
Apr 24 23:58:34.622135 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:34.622112 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log"
Apr 24 23:58:34.622496 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:34.622234 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log"
Apr 24 23:58:34.626080 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:34.626065 2566 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 23:58:51.705076 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:51.705036 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68f7dc7566-rp2x4" podUID="71d8cfb9-113b-476c-81b2-993b02ee81f3" containerName="console" containerID="cri-o://6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464" gracePeriod=15
Apr 24 23:58:51.942973 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:51.942951 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f7dc7566-rp2x4_71d8cfb9-113b-476c-81b2-993b02ee81f3/console/0.log"
Apr 24 23:58:51.943089 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:51.943011 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f7dc7566-rp2x4"
Apr 24 23:58:52.022218 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022128 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-oauth-config\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022218 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022168 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-serving-cert\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022218 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022189 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-service-ca\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022485 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022233 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-oauth-serving-cert\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022485 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022274 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-trusted-ca-bundle\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022485 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022305 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97mm4\" (UniqueName: \"kubernetes.io/projected/71d8cfb9-113b-476c-81b2-993b02ee81f3-kube-api-access-97mm4\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022485 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022328 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-config\") pod \"71d8cfb9-113b-476c-81b2-993b02ee81f3\" (UID: \"71d8cfb9-113b-476c-81b2-993b02ee81f3\") "
Apr 24 23:58:52.022720 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022661 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-service-ca" (OuterVolumeSpecName: "service-ca") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:52.022720 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022687 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:52.022843 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022805 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-config" (OuterVolumeSpecName: "console-config") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:52.022843 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.022815 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:52.023070 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.023041 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-service-ca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.023143 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.023069 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-oauth-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.023143 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.023086 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-trusted-ca-bundle\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.023143 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.023102 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.024440 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.024412 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:52.024440 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.024431 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:52.024614 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.024459 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d8cfb9-113b-476c-81b2-993b02ee81f3-kube-api-access-97mm4" (OuterVolumeSpecName: "kube-api-access-97mm4") pod "71d8cfb9-113b-476c-81b2-993b02ee81f3" (UID: "71d8cfb9-113b-476c-81b2-993b02ee81f3"). InnerVolumeSpecName "kube-api-access-97mm4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:58:52.123451 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.123419 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97mm4\" (UniqueName: \"kubernetes.io/projected/71d8cfb9-113b-476c-81b2-993b02ee81f3-kube-api-access-97mm4\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.123451 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.123458 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-oauth-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.123671 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.123468 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d8cfb9-113b-476c-81b2-993b02ee81f3-console-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 24 23:58:52.703491 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.703464 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f7dc7566-rp2x4_71d8cfb9-113b-476c-81b2-993b02ee81f3/console/0.log"
Apr 24 23:58:52.703676 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.703506 2566 generic.go:358] "Generic (PLEG): container finished" podID="71d8cfb9-113b-476c-81b2-993b02ee81f3" containerID="6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464" exitCode=2
Apr 24 23:58:52.703676 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.703597 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f7dc7566-rp2x4" event={"ID":"71d8cfb9-113b-476c-81b2-993b02ee81f3","Type":"ContainerDied","Data":"6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464"}
Apr 24 23:58:52.703676 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.703619 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f7dc7566-rp2x4"
Apr 24 23:58:52.703676 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.703641 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f7dc7566-rp2x4" event={"ID":"71d8cfb9-113b-476c-81b2-993b02ee81f3","Type":"ContainerDied","Data":"1968e188655dbe1aa41b7201ec823f592433496b3afacb81d85dc1cc2b503ed5"}
Apr 24 23:58:52.703676 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.703657 2566 scope.go:117] "RemoveContainer" containerID="6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464"
Apr 24 23:58:52.711817 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.711444 2566 scope.go:117] "RemoveContainer" containerID="6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464"
Apr 24 23:58:52.711817 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:58:52.711758 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464\": container with ID starting with 6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464 not found: ID does not exist" containerID="6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464"
Apr 24 23:58:52.711817 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.711791 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464"} err="failed to get container status \"6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464\": rpc error: code = NotFound desc = could not find container \"6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464\": container with ID starting with 6bfa1fe681d4afdc9fea636c7a8fc1cda56e54c769c29b025c34b519cf9f5464 not found: ID does not exist"
Apr 24 23:58:52.723888 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.723863 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f7dc7566-rp2x4"]
Apr 24 23:58:52.729789 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:52.729771 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68f7dc7566-rp2x4"]
Apr 24 23:58:54.730561 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:58:54.730527 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d8cfb9-113b-476c-81b2-993b02ee81f3" path="/var/lib/kubelet/pods/71d8cfb9-113b-476c-81b2-993b02ee81f3/volumes"
Apr 24 23:59:23.323690 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.323612 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77bfd744f-6nzkj"]
Apr 24 23:59:23.324155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.323909 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71d8cfb9-113b-476c-81b2-993b02ee81f3" containerName="console"
Apr 24 23:59:23.324155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.323919 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d8cfb9-113b-476c-81b2-993b02ee81f3" containerName="console"
Apr 24 23:59:23.324155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.323979 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="71d8cfb9-113b-476c-81b2-993b02ee81f3" containerName="console"
Apr 24 23:59:23.326794 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.326778 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.336400 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.336373 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77bfd744f-6nzkj"]
Apr 24 23:59:23.378142 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378109 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-trusted-ca-bundle\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.378142 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378143 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-oauth-serving-cert\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.378342 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378172 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-oauth-config\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.378342 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378246 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-config\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.378342 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378273 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-service-ca\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.378342 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68fgj\" (UniqueName: \"kubernetes.io/projected/3f2e2a06-e786-4f1b-9810-c246aa9459ff-kube-api-access-68fgj\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.378342 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.378317 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-serving-cert\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.479480 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-trusted-ca-bundle\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj"
Apr 24 23:59:23.479480 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479481 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-oauth-serving-cert\") pod
\"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.479733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-oauth-config\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.479733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479540 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-config\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.479733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479561 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-service-ca\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.479733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479613 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68fgj\" (UniqueName: \"kubernetes.io/projected/3f2e2a06-e786-4f1b-9810-c246aa9459ff-kube-api-access-68fgj\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.479733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.479642 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-serving-cert\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.480314 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.480284 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-oauth-serving-cert\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.480434 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.480331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-config\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.480434 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.480385 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-trusted-ca-bundle\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.480553 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.480443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-service-ca\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.482037 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.482009 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-serving-cert\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.482151 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.482103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-oauth-config\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.490303 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.490280 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68fgj\" (UniqueName: \"kubernetes.io/projected/3f2e2a06-e786-4f1b-9810-c246aa9459ff-kube-api-access-68fgj\") pod \"console-77bfd744f-6nzkj\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") " pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.636972 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.636898 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:23.754107 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.754082 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77bfd744f-6nzkj"] Apr 24 23:59:23.756590 ip-10-0-140-130 kubenswrapper[2566]: W0424 23:59:23.756544 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f2e2a06_e786_4f1b_9810_c246aa9459ff.slice/crio-29101c1b35dd9408b5c78157df9dce98ad200c20b5cecf246ed34be55c570aee WatchSource:0}: Error finding container 29101c1b35dd9408b5c78157df9dce98ad200c20b5cecf246ed34be55c570aee: Status 404 returned error can't find the container with id 29101c1b35dd9408b5c78157df9dce98ad200c20b5cecf246ed34be55c570aee Apr 24 23:59:23.758280 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.758266 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:59:23.793349 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:23.793317 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bfd744f-6nzkj" event={"ID":"3f2e2a06-e786-4f1b-9810-c246aa9459ff","Type":"ContainerStarted","Data":"29101c1b35dd9408b5c78157df9dce98ad200c20b5cecf246ed34be55c570aee"} Apr 24 23:59:24.797733 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:24.797696 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bfd744f-6nzkj" event={"ID":"3f2e2a06-e786-4f1b-9810-c246aa9459ff","Type":"ContainerStarted","Data":"68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05"} Apr 24 23:59:24.815390 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:24.815343 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77bfd744f-6nzkj" podStartSLOduration=1.815327752 podStartE2EDuration="1.815327752s" podCreationTimestamp="2026-04-24 23:59:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:59:24.814443263 +0000 UTC m=+350.678935563" watchObservedRunningTime="2026-04-24 23:59:24.815327752 +0000 UTC m=+350.679820042" Apr 24 23:59:33.637580 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:33.637543 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:33.638027 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:33.637605 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:33.642217 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:33.642194 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:33.831500 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:33.831471 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77bfd744f-6nzkj" Apr 24 23:59:33.881070 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:33.881026 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-849bb67d76-w9mgv"] Apr 24 23:59:58.906422 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:58.906365 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-849bb67d76-w9mgv" podUID="99124769-77cb-4b30-9c98-f9c6be46a1cb" containerName="console" containerID="cri-o://312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3" gracePeriod=15 Apr 24 23:59:59.148457 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.148435 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-849bb67d76-w9mgv_99124769-77cb-4b30-9c98-f9c6be46a1cb/console/0.log" Apr 24 23:59:59.148592 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.148496 2566 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849bb67d76-w9mgv" Apr 24 23:59:59.164602 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164533 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-oauth-config\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 23:59:59.164602 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164584 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-oauth-serving-cert\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 23:59:59.164757 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164652 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-service-ca\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 23:59:59.164757 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164685 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-config\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 23:59:59.164757 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164715 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-trusted-ca-bundle\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 
23:59:59.164757 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164745 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4wjp\" (UniqueName: \"kubernetes.io/projected/99124769-77cb-4b30-9c98-f9c6be46a1cb-kube-api-access-r4wjp\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 23:59:59.164954 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.164774 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-serving-cert\") pod \"99124769-77cb-4b30-9c98-f9c6be46a1cb\" (UID: \"99124769-77cb-4b30-9c98-f9c6be46a1cb\") " Apr 24 23:59:59.165153 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.165124 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:59:59.165294 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.165263 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-config" (OuterVolumeSpecName: "console-config") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:59:59.165294 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.165272 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-service-ca" (OuterVolumeSpecName: "service-ca") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:59:59.165294 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.165281 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:59:59.167111 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.167082 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99124769-77cb-4b30-9c98-f9c6be46a1cb-kube-api-access-r4wjp" (OuterVolumeSpecName: "kube-api-access-r4wjp") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "kube-api-access-r4wjp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:59:59.167388 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.167335 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:59:59.167484 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.167358 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "99124769-77cb-4b30-9c98-f9c6be46a1cb" (UID: "99124769-77cb-4b30-9c98-f9c6be46a1cb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:59:59.265937 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265906 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-service-ca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.265937 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265932 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.265937 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265941 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-trusted-ca-bundle\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.266155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265950 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4wjp\" (UniqueName: \"kubernetes.io/projected/99124769-77cb-4b30-9c98-f9c6be46a1cb-kube-api-access-r4wjp\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.266155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265960 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.266155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265968 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99124769-77cb-4b30-9c98-f9c6be46a1cb-console-oauth-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.266155 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.265977 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99124769-77cb-4b30-9c98-f9c6be46a1cb-oauth-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 24 23:59:59.901539 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.901514 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-849bb67d76-w9mgv_99124769-77cb-4b30-9c98-f9c6be46a1cb/console/0.log" Apr 24 23:59:59.901759 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.901556 2566 generic.go:358] "Generic (PLEG): container finished" podID="99124769-77cb-4b30-9c98-f9c6be46a1cb" containerID="312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3" exitCode=2 Apr 24 23:59:59.901759 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.901606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849bb67d76-w9mgv" event={"ID":"99124769-77cb-4b30-9c98-f9c6be46a1cb","Type":"ContainerDied","Data":"312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3"} Apr 24 23:59:59.901759 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.901650 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849bb67d76-w9mgv" event={"ID":"99124769-77cb-4b30-9c98-f9c6be46a1cb","Type":"ContainerDied","Data":"174f04df21f8695e03c57a9d8e11ebcc5c1f5aa4f5eeb2188184db7c1b12fcb2"} Apr 24 23:59:59.901759 
ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.901649 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849bb67d76-w9mgv" Apr 24 23:59:59.901759 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.901663 2566 scope.go:117] "RemoveContainer" containerID="312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3" Apr 24 23:59:59.910940 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.910745 2566 scope.go:117] "RemoveContainer" containerID="312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3" Apr 24 23:59:59.911226 ip-10-0-140-130 kubenswrapper[2566]: E0424 23:59:59.911123 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3\": container with ID starting with 312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3 not found: ID does not exist" containerID="312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3" Apr 24 23:59:59.911226 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.911160 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3"} err="failed to get container status \"312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3\": rpc error: code = NotFound desc = could not find container \"312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3\": container with ID starting with 312c750220c064198f6dfae45f66ac4254b124219c9622890dc7f3472c42e4b3 not found: ID does not exist" Apr 24 23:59:59.926431 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.926402 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-849bb67d76-w9mgv"] Apr 24 23:59:59.929951 ip-10-0-140-130 kubenswrapper[2566]: I0424 23:59:59.929927 2566 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-849bb67d76-w9mgv"] Apr 25 00:00:00.152920 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.152823 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29617920-629pq"] Apr 25 00:00:00.153153 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.153140 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99124769-77cb-4b30-9c98-f9c6be46a1cb" containerName="console" Apr 25 00:00:00.153196 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.153155 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="99124769-77cb-4b30-9c98-f9c6be46a1cb" containerName="console" Apr 25 00:00:00.153229 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.153213 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="99124769-77cb-4b30-9c98-f9c6be46a1cb" containerName="console" Apr 25 00:00:00.157451 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.157434 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-629pq" Apr 25 00:00:00.160074 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.160044 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 25 00:00:00.160236 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.160144 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-b56vf\"" Apr 25 00:00:00.165948 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.165926 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-629pq"] Apr 25 00:00:00.172687 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.172662 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a6dc00-7da1-4523-ba9c-8d66c7704998-serviceca\") pod \"image-pruner-29617920-629pq\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") " pod="openshift-image-registry/image-pruner-29617920-629pq" Apr 25 00:00:00.172813 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.172716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xln\" (UniqueName: \"kubernetes.io/projected/90a6dc00-7da1-4523-ba9c-8d66c7704998-kube-api-access-h5xln\") pod \"image-pruner-29617920-629pq\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") " pod="openshift-image-registry/image-pruner-29617920-629pq" Apr 25 00:00:00.273681 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.273643 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a6dc00-7da1-4523-ba9c-8d66c7704998-serviceca\") pod \"image-pruner-29617920-629pq\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") " pod="openshift-image-registry/image-pruner-29617920-629pq" 
Apr 25 00:00:00.273879 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.273701 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xln\" (UniqueName: \"kubernetes.io/projected/90a6dc00-7da1-4523-ba9c-8d66c7704998-kube-api-access-h5xln\") pod \"image-pruner-29617920-629pq\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") " pod="openshift-image-registry/image-pruner-29617920-629pq" Apr 25 00:00:00.274303 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.274276 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a6dc00-7da1-4523-ba9c-8d66c7704998-serviceca\") pod \"image-pruner-29617920-629pq\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") " pod="openshift-image-registry/image-pruner-29617920-629pq" Apr 25 00:00:00.282910 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.282882 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xln\" (UniqueName: \"kubernetes.io/projected/90a6dc00-7da1-4523-ba9c-8d66c7704998-kube-api-access-h5xln\") pod \"image-pruner-29617920-629pq\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") " pod="openshift-image-registry/image-pruner-29617920-629pq" Apr 25 00:00:00.489055 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.489025 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-629pq"
Apr 25 00:00:00.615832 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.615809 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-629pq"]
Apr 25 00:00:00.618393 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:00:00.618363 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a6dc00_7da1_4523_ba9c_8d66c7704998.slice/crio-7ebd823ea72a048b4f8b80e24ac6d052f0a7c13f6019295c4b6b3f8e2be003d7 WatchSource:0}: Error finding container 7ebd823ea72a048b4f8b80e24ac6d052f0a7c13f6019295c4b6b3f8e2be003d7: Status 404 returned error can't find the container with id 7ebd823ea72a048b4f8b80e24ac6d052f0a7c13f6019295c4b6b3f8e2be003d7
Apr 25 00:00:00.731546 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.731464 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99124769-77cb-4b30-9c98-f9c6be46a1cb" path="/var/lib/kubelet/pods/99124769-77cb-4b30-9c98-f9c6be46a1cb/volumes"
Apr 25 00:00:00.906143 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.906104 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-629pq" event={"ID":"90a6dc00-7da1-4523-ba9c-8d66c7704998","Type":"ContainerStarted","Data":"9bb2bf4a255f841b27a26e0d6775a77f383b7c10cd90e9a250de4f27411e527f"}
Apr 25 00:00:00.906143 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.906141 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-629pq" event={"ID":"90a6dc00-7da1-4523-ba9c-8d66c7704998","Type":"ContainerStarted","Data":"7ebd823ea72a048b4f8b80e24ac6d052f0a7c13f6019295c4b6b3f8e2be003d7"}
Apr 25 00:00:00.927825 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:00.927777 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29617920-629pq" podStartSLOduration=0.927762596 podStartE2EDuration="927.762596ms" podCreationTimestamp="2026-04-25 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:00:00.925050363 +0000 UTC m=+386.789542653" watchObservedRunningTime="2026-04-25 00:00:00.927762596 +0000 UTC m=+386.792254884"
Apr 25 00:00:01.910323 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:01.910283 2566 generic.go:358] "Generic (PLEG): container finished" podID="90a6dc00-7da1-4523-ba9c-8d66c7704998" containerID="9bb2bf4a255f841b27a26e0d6775a77f383b7c10cd90e9a250de4f27411e527f" exitCode=0
Apr 25 00:00:01.910543 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:01.910382 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-629pq" event={"ID":"90a6dc00-7da1-4523-ba9c-8d66c7704998","Type":"ContainerDied","Data":"9bb2bf4a255f841b27a26e0d6775a77f383b7c10cd90e9a250de4f27411e527f"}
Apr 25 00:00:03.039497 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.039467 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-629pq"
Apr 25 00:00:03.096495 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.096455 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xln\" (UniqueName: \"kubernetes.io/projected/90a6dc00-7da1-4523-ba9c-8d66c7704998-kube-api-access-h5xln\") pod \"90a6dc00-7da1-4523-ba9c-8d66c7704998\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") "
Apr 25 00:00:03.096495 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.096506 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a6dc00-7da1-4523-ba9c-8d66c7704998-serviceca\") pod \"90a6dc00-7da1-4523-ba9c-8d66c7704998\" (UID: \"90a6dc00-7da1-4523-ba9c-8d66c7704998\") "
Apr 25 00:00:03.096866 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.096843 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90a6dc00-7da1-4523-ba9c-8d66c7704998-serviceca" (OuterVolumeSpecName: "serviceca") pod "90a6dc00-7da1-4523-ba9c-8d66c7704998" (UID: "90a6dc00-7da1-4523-ba9c-8d66c7704998"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:00:03.098663 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.098642 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a6dc00-7da1-4523-ba9c-8d66c7704998-kube-api-access-h5xln" (OuterVolumeSpecName: "kube-api-access-h5xln") pod "90a6dc00-7da1-4523-ba9c-8d66c7704998" (UID: "90a6dc00-7da1-4523-ba9c-8d66c7704998"). InnerVolumeSpecName "kube-api-access-h5xln". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:00:03.197875 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.197837 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5xln\" (UniqueName: \"kubernetes.io/projected/90a6dc00-7da1-4523-ba9c-8d66c7704998-kube-api-access-h5xln\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:00:03.197875 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.197867 2566 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a6dc00-7da1-4523-ba9c-8d66c7704998-serviceca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:00:03.916805 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.916772 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-629pq"
Apr 25 00:00:03.916991 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.916770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-629pq" event={"ID":"90a6dc00-7da1-4523-ba9c-8d66c7704998","Type":"ContainerDied","Data":"7ebd823ea72a048b4f8b80e24ac6d052f0a7c13f6019295c4b6b3f8e2be003d7"}
Apr 25 00:00:03.916991 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:03.916888 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ebd823ea72a048b4f8b80e24ac6d052f0a7c13f6019295c4b6b3f8e2be003d7"
Apr 25 00:00:09.346420 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.346380 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"]
Apr 25 00:00:09.346919 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.346711 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90a6dc00-7da1-4523-ba9c-8d66c7704998" containerName="image-pruner"
Apr 25 00:00:09.346919 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.346723 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a6dc00-7da1-4523-ba9c-8d66c7704998" containerName="image-pruner"
Apr 25 00:00:09.346919 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.346808 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="90a6dc00-7da1-4523-ba9c-8d66c7704998" containerName="image-pruner"
Apr 25 00:00:09.350152 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.350135 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.352754 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.352732 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 25 00:00:09.352913 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.352742 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6l59n\""
Apr 25 00:00:09.353799 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.353782 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 25 00:00:09.357757 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.357735 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"]
Apr 25 00:00:09.447926 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.447890 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.447926 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.447928 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.448169 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.447951 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsllj\" (UniqueName: \"kubernetes.io/projected/57452a6f-3151-47ad-bdb7-8d9588da2546-kube-api-access-hsllj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.549408 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.549375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.549408 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.549411 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.549615 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.549433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsllj\" (UniqueName: \"kubernetes.io/projected/57452a6f-3151-47ad-bdb7-8d9588da2546-kube-api-access-hsllj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.549860 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.549838 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.549896 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.549847 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.559410 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.559386 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsllj\" (UniqueName: \"kubernetes.io/projected/57452a6f-3151-47ad-bdb7-8d9588da2546-kube-api-access-hsllj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.661004 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.660918 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:09.780764 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.780734 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"]
Apr 25 00:00:09.784497 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:00:09.784469 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57452a6f_3151_47ad_bdb7_8d9588da2546.slice/crio-fe893fe1ac80c5d63b9d391df0205995d78cd89800e789a0c8dd97624c47ade7 WatchSource:0}: Error finding container fe893fe1ac80c5d63b9d391df0205995d78cd89800e789a0c8dd97624c47ade7: Status 404 returned error can't find the container with id fe893fe1ac80c5d63b9d391df0205995d78cd89800e789a0c8dd97624c47ade7
Apr 25 00:00:09.939093 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:09.939061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj" event={"ID":"57452a6f-3151-47ad-bdb7-8d9588da2546","Type":"ContainerStarted","Data":"fe893fe1ac80c5d63b9d391df0205995d78cd89800e789a0c8dd97624c47ade7"}
Apr 25 00:00:15.961611 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:15.961558 2566 generic.go:358] "Generic (PLEG): container finished" podID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerID="230747c1f1f63b5696ec6446ea2736bd3e368a9145a828cd70113ff9ce571537" exitCode=0
Apr 25 00:00:15.961970 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:15.961663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj" event={"ID":"57452a6f-3151-47ad-bdb7-8d9588da2546","Type":"ContainerDied","Data":"230747c1f1f63b5696ec6446ea2736bd3e368a9145a828cd70113ff9ce571537"}
Apr 25 00:00:21.981904 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:21.981871 2566 generic.go:358] "Generic (PLEG): container finished" podID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerID="bcb1f593465cf2ff006f9d02e0e70b28353aae23775dfd0fdba19450ad620574" exitCode=0
Apr 25 00:00:21.982304 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:21.981940 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj" event={"ID":"57452a6f-3151-47ad-bdb7-8d9588da2546","Type":"ContainerDied","Data":"bcb1f593465cf2ff006f9d02e0e70b28353aae23775dfd0fdba19450ad620574"}
Apr 25 00:00:29.006080 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:29.006044 2566 generic.go:358] "Generic (PLEG): container finished" podID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerID="202ab978f57ca83f623663edb535bfbaaecfd506b2e0d1f9f16d936ac4b136be" exitCode=0
Apr 25 00:00:29.006484 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:29.006137 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj" event={"ID":"57452a6f-3151-47ad-bdb7-8d9588da2546","Type":"ContainerDied","Data":"202ab978f57ca83f623663edb535bfbaaecfd506b2e0d1f9f16d936ac4b136be"}
Apr 25 00:00:30.129403 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.129381 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:30.228254 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.228214 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-util\") pod \"57452a6f-3151-47ad-bdb7-8d9588da2546\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") "
Apr 25 00:00:30.228441 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.228276 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-bundle\") pod \"57452a6f-3151-47ad-bdb7-8d9588da2546\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") "
Apr 25 00:00:30.228441 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.228311 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsllj\" (UniqueName: \"kubernetes.io/projected/57452a6f-3151-47ad-bdb7-8d9588da2546-kube-api-access-hsllj\") pod \"57452a6f-3151-47ad-bdb7-8d9588da2546\" (UID: \"57452a6f-3151-47ad-bdb7-8d9588da2546\") "
Apr 25 00:00:30.228995 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.228967 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-bundle" (OuterVolumeSpecName: "bundle") pod "57452a6f-3151-47ad-bdb7-8d9588da2546" (UID: "57452a6f-3151-47ad-bdb7-8d9588da2546"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:00:30.230472 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.230450 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57452a6f-3151-47ad-bdb7-8d9588da2546-kube-api-access-hsllj" (OuterVolumeSpecName: "kube-api-access-hsllj") pod "57452a6f-3151-47ad-bdb7-8d9588da2546" (UID: "57452a6f-3151-47ad-bdb7-8d9588da2546"). InnerVolumeSpecName "kube-api-access-hsllj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:00:30.233868 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.233844 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-util" (OuterVolumeSpecName: "util") pod "57452a6f-3151-47ad-bdb7-8d9588da2546" (UID: "57452a6f-3151-47ad-bdb7-8d9588da2546"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:00:30.329852 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.329772 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-util\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:00:30.329852 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.329802 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57452a6f-3151-47ad-bdb7-8d9588da2546-bundle\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:00:30.329852 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:30.329812 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsllj\" (UniqueName: \"kubernetes.io/projected/57452a6f-3151-47ad-bdb7-8d9588da2546-kube-api-access-hsllj\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:00:30.760093 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:30.760064 2566 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57452a6f_3151_47ad_bdb7_8d9588da2546.slice\": RecentStats: unable to find data in memory cache]"
Apr 25 00:00:31.013659 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:31.013558 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj" event={"ID":"57452a6f-3151-47ad-bdb7-8d9588da2546","Type":"ContainerDied","Data":"fe893fe1ac80c5d63b9d391df0205995d78cd89800e789a0c8dd97624c47ade7"}
Apr 25 00:00:31.013659 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:31.013620 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe893fe1ac80c5d63b9d391df0205995d78cd89800e789a0c8dd97624c47ade7"
Apr 25 00:00:31.013659 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:31.013587 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjd5fj"
Apr 25 00:00:36.252706 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.252667 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"]
Apr 25 00:00:36.253101 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253006 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="util"
Apr 25 00:00:36.253101 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253018 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="util"
Apr 25 00:00:36.253101 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253033 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="pull"
Apr 25 00:00:36.253101 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253042 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="pull"
Apr 25 00:00:36.253101 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253053 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="extract"
Apr 25 00:00:36.253101 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253059 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="extract"
Apr 25 00:00:36.253282 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.253105 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="57452a6f-3151-47ad-bdb7-8d9588da2546" containerName="extract"
Apr 25 00:00:36.260153 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.260131 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.263551 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.263533 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-zsznr\""
Apr 25 00:00:36.264173 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.264151 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 25 00:00:36.264249 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.264189 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 25 00:00:36.264439 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.264425 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 25 00:00:36.272357 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.272336 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"]
Apr 25 00:00:36.376847 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.376815 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3bec49a8-efe2-4a28-8b4c-18d835559d09-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nz22j\" (UID: \"3bec49a8-efe2-4a28-8b4c-18d835559d09\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.377016 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.376861 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cvd4\" (UniqueName: \"kubernetes.io/projected/3bec49a8-efe2-4a28-8b4c-18d835559d09-kube-api-access-9cvd4\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nz22j\" (UID: \"3bec49a8-efe2-4a28-8b4c-18d835559d09\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.477817 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.477783 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3bec49a8-efe2-4a28-8b4c-18d835559d09-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nz22j\" (UID: \"3bec49a8-efe2-4a28-8b4c-18d835559d09\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.477973 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.477835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cvd4\" (UniqueName: \"kubernetes.io/projected/3bec49a8-efe2-4a28-8b4c-18d835559d09-kube-api-access-9cvd4\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nz22j\" (UID: \"3bec49a8-efe2-4a28-8b4c-18d835559d09\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.480190 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.480160 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3bec49a8-efe2-4a28-8b4c-18d835559d09-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nz22j\" (UID: \"3bec49a8-efe2-4a28-8b4c-18d835559d09\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.486483 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.486450 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cvd4\" (UniqueName: \"kubernetes.io/projected/3bec49a8-efe2-4a28-8b4c-18d835559d09-kube-api-access-9cvd4\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nz22j\" (UID: \"3bec49a8-efe2-4a28-8b4c-18d835559d09\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.570743 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.570642 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:36.706746 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:36.706627 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"]
Apr 25 00:00:36.710135 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:00:36.710095 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bec49a8_efe2_4a28_8b4c_18d835559d09.slice/crio-1d5a75a30f61c5ca26af6a22720919a69690903c5b65f4321065af6495cd6b58 WatchSource:0}: Error finding container 1d5a75a30f61c5ca26af6a22720919a69690903c5b65f4321065af6495cd6b58: Status 404 returned error can't find the container with id 1d5a75a30f61c5ca26af6a22720919a69690903c5b65f4321065af6495cd6b58
Apr 25 00:00:37.032301 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:37.032261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j" event={"ID":"3bec49a8-efe2-4a28-8b4c-18d835559d09","Type":"ContainerStarted","Data":"1d5a75a30f61c5ca26af6a22720919a69690903c5b65f4321065af6495cd6b58"}
Apr 25 00:00:43.053417 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.053375 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j" event={"ID":"3bec49a8-efe2-4a28-8b4c-18d835559d09","Type":"ContainerStarted","Data":"be1ab5e8c3a6cc360241ee24103c838a0eb1aa1872280b4522b509183fe17d7d"}
Apr 25 00:00:43.053800 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.053503 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j"
Apr 25 00:00:43.089725 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.089610 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j" podStartSLOduration=0.98949409 podStartE2EDuration="7.089593471s" podCreationTimestamp="2026-04-25 00:00:36 +0000 UTC" firstStartedPulling="2026-04-25 00:00:36.712538743 +0000 UTC m=+422.577031009" lastFinishedPulling="2026-04-25 00:00:42.812638124 +0000 UTC m=+428.677130390" observedRunningTime="2026-04-25 00:00:43.087023376 +0000 UTC m=+428.951515664" watchObservedRunningTime="2026-04-25 00:00:43.089593471 +0000 UTC m=+428.954085759"
Apr 25 00:00:43.366495 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.366415 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9sp87"]
Apr 25 00:00:43.370381 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.370358 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.374425 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.374402 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 25 00:00:43.374691 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.374675 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 25 00:00:43.374972 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.374954 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-ljd5x\""
Apr 25 00:00:43.378339 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.378317 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9sp87"]
Apr 25 00:00:43.538150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.538112 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-cabundle0\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.538352 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.538182 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.538352 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.538229 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8nk4\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-kube-api-access-m8nk4\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.633544 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.633471 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"]
Apr 25 00:00:43.636790 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.636772 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.639311 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.639283 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.639446 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.639364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8nk4\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-kube-api-access-m8nk4\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.639446 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.639403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-cabundle0\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.639446 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.639428 2566 secret.go:281] references non-existent secret key: ca.crt
Apr 25 00:00:43.639446 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.639444 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 25 00:00:43.639655 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.639455 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9sp87: references non-existent secret key: ca.crt
Apr 25 00:00:43.639655 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.639511 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates podName:01ba653a-1ec6-404c-9cf5-74d8b17ccdd5 nodeName:}" failed. No retries permitted until 2026-04-25 00:00:44.139491653 +0000 UTC m=+430.003983921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates") pod "keda-operator-ffbb595cb-9sp87" (UID: "01ba653a-1ec6-404c-9cf5-74d8b17ccdd5") : references non-existent secret key: ca.crt
Apr 25 00:00:43.639957 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.639915 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 25 00:00:43.640079 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.640058 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-cabundle0\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.645074 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.645045 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"]
Apr 25 00:00:43.651357 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.651331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8nk4\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-kube-api-access-m8nk4\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87"
Apr 25 00:00:43.740141 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.740097 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2qlc\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-kube-api-access-h2qlc\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.740141 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.740141 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ba8130e5-c331-4104-85e3-62cb73c068c8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.740399 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.740163 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.841560 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.841523 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2qlc\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-kube-api-access-h2qlc\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.841757 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.841594 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ba8130e5-c331-4104-85e3-62cb73c068c8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.841757 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.841618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"
Apr 25 00:00:43.841845 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.841792 2566 secret.go:281] references non-existent secret key: tls.crt
Apr 25 00:00:43.841845 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.841812 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 25 00:00:43.841845 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.841836 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5: references non-existent secret key: tls.crt
Apr 25 00:00:43.841934 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:43.841893 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates podName:ba8130e5-c331-4104-85e3-62cb73c068c8 nodeName:}" failed. No retries permitted until 2026-04-25 00:00:44.34187507 +0000 UTC m=+430.206367354 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates") pod "keda-metrics-apiserver-7c9f485588-5zkb5" (UID: "ba8130e5-c331-4104-85e3-62cb73c068c8") : references non-existent secret key: tls.crt Apr 25 00:00:43.841982 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.841970 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ba8130e5-c331-4104-85e3-62cb73c068c8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:43.850046 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:43.850016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2qlc\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-kube-api-access-h2qlc\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:44.144296 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:44.144254 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:00:44.145000 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.144978 2566 secret.go:281] references non-existent secret key: ca.crt Apr 25 00:00:44.145112 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.145101 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 25 00:00:44.145186 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.145178 2566 
projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9sp87: references non-existent secret key: ca.crt Apr 25 00:00:44.145304 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.145295 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates podName:01ba653a-1ec6-404c-9cf5-74d8b17ccdd5 nodeName:}" failed. No retries permitted until 2026-04-25 00:00:45.145274753 +0000 UTC m=+431.009767021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates") pod "keda-operator-ffbb595cb-9sp87" (UID: "01ba653a-1ec6-404c-9cf5-74d8b17ccdd5") : references non-existent secret key: ca.crt Apr 25 00:00:44.347002 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:44.346959 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:44.347172 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.347142 2566 secret.go:281] references non-existent secret key: tls.crt Apr 25 00:00:44.347172 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.347168 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 25 00:00:44.347244 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.347197 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5: references non-existent secret key: tls.crt Apr 25 00:00:44.347278 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:44.347266 2566 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates podName:ba8130e5-c331-4104-85e3-62cb73c068c8 nodeName:}" failed. No retries permitted until 2026-04-25 00:00:45.347246814 +0000 UTC m=+431.211739085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates") pod "keda-metrics-apiserver-7c9f485588-5zkb5" (UID: "ba8130e5-c331-4104-85e3-62cb73c068c8") : references non-existent secret key: tls.crt Apr 25 00:00:45.153135 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:45.153098 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:00:45.153504 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.153236 2566 secret.go:281] references non-existent secret key: ca.crt Apr 25 00:00:45.153504 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.153256 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 25 00:00:45.153504 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.153265 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9sp87: references non-existent secret key: ca.crt Apr 25 00:00:45.153504 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.153328 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates podName:01ba653a-1ec6-404c-9cf5-74d8b17ccdd5 nodeName:}" failed. 
No retries permitted until 2026-04-25 00:00:47.15331361 +0000 UTC m=+433.017805878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates") pod "keda-operator-ffbb595cb-9sp87" (UID: "01ba653a-1ec6-404c-9cf5-74d8b17ccdd5") : references non-existent secret key: ca.crt Apr 25 00:00:45.355825 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:45.355788 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:45.356011 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.355845 2566 secret.go:281] references non-existent secret key: tls.crt Apr 25 00:00:45.356011 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.355871 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 25 00:00:45.356011 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.355895 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5: references non-existent secret key: tls.crt Apr 25 00:00:45.356011 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:00:45.355958 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates podName:ba8130e5-c331-4104-85e3-62cb73c068c8 nodeName:}" failed. No retries permitted until 2026-04-25 00:00:47.355938964 +0000 UTC m=+433.220431245 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates") pod "keda-metrics-apiserver-7c9f485588-5zkb5" (UID: "ba8130e5-c331-4104-85e3-62cb73c068c8") : references non-existent secret key: tls.crt Apr 25 00:00:47.172798 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.172761 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:00:47.175091 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.175069 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01ba653a-1ec6-404c-9cf5-74d8b17ccdd5-certificates\") pod \"keda-operator-ffbb595cb-9sp87\" (UID: \"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5\") " pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:00:47.281719 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.281686 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:00:47.374411 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.374371 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:47.376900 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.376861 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ba8130e5-c331-4104-85e3-62cb73c068c8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkb5\" (UID: \"ba8130e5-c331-4104-85e3-62cb73c068c8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:47.398420 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.398396 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9sp87"] Apr 25 00:00:47.400066 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:00:47.400042 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ba653a_1ec6_404c_9cf5_74d8b17ccdd5.slice/crio-62bbd3ba371d4ba834ac42cdb1bcba3edba086eb1a070cd9ce5b3821870d0f24 WatchSource:0}: Error finding container 62bbd3ba371d4ba834ac42cdb1bcba3edba086eb1a070cd9ce5b3821870d0f24: Status 404 returned error can't find the container with id 62bbd3ba371d4ba834ac42cdb1bcba3edba086eb1a070cd9ce5b3821870d0f24 Apr 25 00:00:47.550684 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.550655 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:47.665168 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:47.665142 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5"] Apr 25 00:00:47.667509 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:00:47.667476 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8130e5_c331_4104_85e3_62cb73c068c8.slice/crio-dd84949c3c9a365f33d509731e1fa0c479db8b555ec3e1ccfb26157ac872f81e WatchSource:0}: Error finding container dd84949c3c9a365f33d509731e1fa0c479db8b555ec3e1ccfb26157ac872f81e: Status 404 returned error can't find the container with id dd84949c3c9a365f33d509731e1fa0c479db8b555ec3e1ccfb26157ac872f81e Apr 25 00:00:48.071062 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:48.071023 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" event={"ID":"ba8130e5-c331-4104-85e3-62cb73c068c8","Type":"ContainerStarted","Data":"dd84949c3c9a365f33d509731e1fa0c479db8b555ec3e1ccfb26157ac872f81e"} Apr 25 00:00:48.071948 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:48.071923 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9sp87" event={"ID":"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5","Type":"ContainerStarted","Data":"62bbd3ba371d4ba834ac42cdb1bcba3edba086eb1a070cd9ce5b3821870d0f24"} Apr 25 00:00:52.088626 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:52.088585 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9sp87" event={"ID":"01ba653a-1ec6-404c-9cf5-74d8b17ccdd5","Type":"ContainerStarted","Data":"c3e0da1a142c4ea5a36622b54ad02b9eddb4f11fd137b15c31e3f2bfaaf95909"} Apr 25 00:00:52.089013 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:52.088663 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:00:52.105747 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:52.105657 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-9sp87" podStartSLOduration=4.571861748 podStartE2EDuration="9.105643588s" podCreationTimestamp="2026-04-25 00:00:43 +0000 UTC" firstStartedPulling="2026-04-25 00:00:47.401273677 +0000 UTC m=+433.265765943" lastFinishedPulling="2026-04-25 00:00:51.935055518 +0000 UTC m=+437.799547783" observedRunningTime="2026-04-25 00:00:52.104181395 +0000 UTC m=+437.968673673" watchObservedRunningTime="2026-04-25 00:00:52.105643588 +0000 UTC m=+437.970135875" Apr 25 00:00:54.097051 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:54.097016 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" event={"ID":"ba8130e5-c331-4104-85e3-62cb73c068c8","Type":"ContainerStarted","Data":"1ca943194dcb6dc5b8c8d7bbfce80594910f1539eec16fcaedf78d553e6c231b"} Apr 25 00:00:54.097499 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:54.097141 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:00:54.118502 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:00:54.118452 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" podStartSLOduration=5.481577381 podStartE2EDuration="11.118439729s" podCreationTimestamp="2026-04-25 00:00:43 +0000 UTC" firstStartedPulling="2026-04-25 00:00:47.66886156 +0000 UTC m=+433.533353826" lastFinishedPulling="2026-04-25 00:00:53.305723905 +0000 UTC m=+439.170216174" observedRunningTime="2026-04-25 00:00:54.118051835 +0000 UTC m=+439.982544124" watchObservedRunningTime="2026-04-25 00:00:54.118439729 +0000 UTC m=+439.982932016" Apr 25 00:01:04.059636 
ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:04.059599 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nz22j" Apr 25 00:01:05.106231 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:05.106200 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkb5" Apr 25 00:01:13.094166 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:13.094135 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-9sp87" Apr 25 00:01:50.434619 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.434583 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-4n8hc"] Apr 25 00:01:50.441967 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.441939 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7"] Apr 25 00:01:50.442083 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.442000 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.444758 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.444719 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 25 00:01:50.445845 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.445826 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 25 00:01:50.445962 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.445844 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-5q8tf\"" Apr 25 00:01:50.445962 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.445828 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 25 00:01:50.446466 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.446447 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.446812 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.446794 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-4n8hc"] Apr 25 00:01:50.449261 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.449241 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 25 00:01:50.449506 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.449491 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nshtp\"" Apr 25 00:01:50.457690 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.457666 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7"] Apr 25 00:01:50.512844 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.512811 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rchxs\" (UniqueName: \"kubernetes.io/projected/17989510-c022-447b-8cc2-94a511145bc5-kube-api-access-rchxs\") pod \"llmisvc-controller-manager-68cc5db7c4-ngfw7\" (UID: \"17989510-c022-447b-8cc2-94a511145bc5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.513031 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.512866 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-cert\") pod \"kserve-controller-manager-64c4d9588d-4n8hc\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") " pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.513031 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.512959 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/17989510-c022-447b-8cc2-94a511145bc5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ngfw7\" (UID: \"17989510-c022-447b-8cc2-94a511145bc5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.513031 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.512987 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvhj\" (UniqueName: \"kubernetes.io/projected/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-kube-api-access-hmvhj\") pod \"kserve-controller-manager-64c4d9588d-4n8hc\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") " pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.613476 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.613433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-cert\") pod \"kserve-controller-manager-64c4d9588d-4n8hc\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") " pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.613685 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.613497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17989510-c022-447b-8cc2-94a511145bc5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ngfw7\" (UID: \"17989510-c022-447b-8cc2-94a511145bc5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.613685 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.613523 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvhj\" (UniqueName: \"kubernetes.io/projected/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-kube-api-access-hmvhj\") pod \"kserve-controller-manager-64c4d9588d-4n8hc\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") " pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.613685 
ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.613595 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rchxs\" (UniqueName: \"kubernetes.io/projected/17989510-c022-447b-8cc2-94a511145bc5-kube-api-access-rchxs\") pod \"llmisvc-controller-manager-68cc5db7c4-ngfw7\" (UID: \"17989510-c022-447b-8cc2-94a511145bc5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.616102 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.616081 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17989510-c022-447b-8cc2-94a511145bc5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ngfw7\" (UID: \"17989510-c022-447b-8cc2-94a511145bc5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.616166 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.616081 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-cert\") pod \"kserve-controller-manager-64c4d9588d-4n8hc\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") " pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.621524 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.621496 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvhj\" (UniqueName: \"kubernetes.io/projected/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-kube-api-access-hmvhj\") pod \"kserve-controller-manager-64c4d9588d-4n8hc\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") " pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.621704 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.621686 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchxs\" (UniqueName: \"kubernetes.io/projected/17989510-c022-447b-8cc2-94a511145bc5-kube-api-access-rchxs\") pod 
\"llmisvc-controller-manager-68cc5db7c4-ngfw7\" (UID: \"17989510-c022-447b-8cc2-94a511145bc5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.757000 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.756925 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" Apr 25 00:01:50.764702 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.764675 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" Apr 25 00:01:50.893105 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.893033 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-4n8hc"] Apr 25 00:01:50.895619 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:01:50.895588 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c4c9b48_db0d_42ad_8a4a_25704aa5cc59.slice/crio-7b833556f8da6463c3ed13a7137699830d717e6bd965d50166eecad556b474ea WatchSource:0}: Error finding container 7b833556f8da6463c3ed13a7137699830d717e6bd965d50166eecad556b474ea: Status 404 returned error can't find the container with id 7b833556f8da6463c3ed13a7137699830d717e6bd965d50166eecad556b474ea Apr 25 00:01:50.913691 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:50.913667 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7"] Apr 25 00:01:50.915187 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:01:50.915162 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod17989510_c022_447b_8cc2_94a511145bc5.slice/crio-44833e597daeeeb29fd8a858492189417700671b3303307394fbed0c721da456 WatchSource:0}: Error finding container 44833e597daeeeb29fd8a858492189417700671b3303307394fbed0c721da456: Status 404 returned error can't find the container with id 
44833e597daeeeb29fd8a858492189417700671b3303307394fbed0c721da456
Apr 25 00:01:51.285209 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:51.285166 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" event={"ID":"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59","Type":"ContainerStarted","Data":"7b833556f8da6463c3ed13a7137699830d717e6bd965d50166eecad556b474ea"}
Apr 25 00:01:51.286081 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:51.286058 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" event={"ID":"17989510-c022-447b-8cc2-94a511145bc5","Type":"ContainerStarted","Data":"44833e597daeeeb29fd8a858492189417700671b3303307394fbed0c721da456"}
Apr 25 00:01:54.298699 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:54.298662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" event={"ID":"17989510-c022-447b-8cc2-94a511145bc5","Type":"ContainerStarted","Data":"63abd393d5b97966eb26de4e8e39bc1c31992a9e52ee9cd3e6315d2e4639c9d5"}
Apr 25 00:01:54.299133 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:54.298759 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7"
Apr 25 00:01:54.300209 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:54.300186 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" event={"ID":"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59","Type":"ContainerStarted","Data":"1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace"}
Apr 25 00:01:54.300347 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:54.300335 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc"
Apr 25 00:01:54.314520 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:54.314476 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7" podStartSLOduration=1.075713921 podStartE2EDuration="4.31445378s" podCreationTimestamp="2026-04-25 00:01:50 +0000 UTC" firstStartedPulling="2026-04-25 00:01:50.916384344 +0000 UTC m=+496.780876610" lastFinishedPulling="2026-04-25 00:01:54.155124199 +0000 UTC m=+500.019616469" observedRunningTime="2026-04-25 00:01:54.313886749 +0000 UTC m=+500.178379037" watchObservedRunningTime="2026-04-25 00:01:54.31445378 +0000 UTC m=+500.178946068"
Apr 25 00:01:54.329239 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:01:54.329193 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" podStartSLOduration=1.029092326 podStartE2EDuration="4.329178676s" podCreationTimestamp="2026-04-25 00:01:50 +0000 UTC" firstStartedPulling="2026-04-25 00:01:50.89704666 +0000 UTC m=+496.761538933" lastFinishedPulling="2026-04-25 00:01:54.197133002 +0000 UTC m=+500.061625283" observedRunningTime="2026-04-25 00:01:54.328247855 +0000 UTC m=+500.192740144" watchObservedRunningTime="2026-04-25 00:01:54.329178676 +0000 UTC m=+500.193670964"
Apr 25 00:02:25.306769 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:25.306680 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngfw7"
Apr 25 00:02:25.309664 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:25.309643 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc"
Apr 25 00:02:26.725622 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.725586 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-4n8hc"]
Apr 25 00:02:26.726005 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.725794 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" podUID="8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" containerName="manager" containerID="cri-o://1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace" gracePeriod=10
Apr 25 00:02:26.751150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.751118 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-ggrlp"]
Apr 25 00:02:26.757478 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.756083 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.763505 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.763473 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-ggrlp"]
Apr 25 00:02:26.827697 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.827660 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9vp\" (UniqueName: \"kubernetes.io/projected/25fe2e2f-9a36-4a32-940e-d57e6113c6a6-kube-api-access-zs9vp\") pod \"kserve-controller-manager-64c4d9588d-ggrlp\" (UID: \"25fe2e2f-9a36-4a32-940e-d57e6113c6a6\") " pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.827834 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.827776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fe2e2f-9a36-4a32-940e-d57e6113c6a6-cert\") pod \"kserve-controller-manager-64c4d9588d-ggrlp\" (UID: \"25fe2e2f-9a36-4a32-940e-d57e6113c6a6\") " pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.929125 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.929090 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fe2e2f-9a36-4a32-940e-d57e6113c6a6-cert\") pod \"kserve-controller-manager-64c4d9588d-ggrlp\" (UID: \"25fe2e2f-9a36-4a32-940e-d57e6113c6a6\") " pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.929293 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.929145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9vp\" (UniqueName: \"kubernetes.io/projected/25fe2e2f-9a36-4a32-940e-d57e6113c6a6-kube-api-access-zs9vp\") pod \"kserve-controller-manager-64c4d9588d-ggrlp\" (UID: \"25fe2e2f-9a36-4a32-940e-d57e6113c6a6\") " pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.931540 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.931518 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fe2e2f-9a36-4a32-940e-d57e6113c6a6-cert\") pod \"kserve-controller-manager-64c4d9588d-ggrlp\" (UID: \"25fe2e2f-9a36-4a32-940e-d57e6113c6a6\") " pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.938583 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.938537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9vp\" (UniqueName: \"kubernetes.io/projected/25fe2e2f-9a36-4a32-940e-d57e6113c6a6-kube-api-access-zs9vp\") pod \"kserve-controller-manager-64c4d9588d-ggrlp\" (UID: \"25fe2e2f-9a36-4a32-940e-d57e6113c6a6\") " pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:26.956735 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:26.956712 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc"
Apr 25 00:02:27.030357 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.030277 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-cert\") pod \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") "
Apr 25 00:02:27.030357 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.030351 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmvhj\" (UniqueName: \"kubernetes.io/projected/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-kube-api-access-hmvhj\") pod \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\" (UID: \"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59\") "
Apr 25 00:02:27.032455 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.032423 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-kube-api-access-hmvhj" (OuterVolumeSpecName: "kube-api-access-hmvhj") pod "8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" (UID: "8c4c9b48-db0d-42ad-8a4a-25704aa5cc59"). InnerVolumeSpecName "kube-api-access-hmvhj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:02:27.032455 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.032423 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-cert" (OuterVolumeSpecName: "cert") pod "8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" (UID: "8c4c9b48-db0d-42ad-8a4a-25704aa5cc59"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:02:27.110542 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.110500 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:27.131247 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.131221 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:02:27.131247 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.131249 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hmvhj\" (UniqueName: \"kubernetes.io/projected/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59-kube-api-access-hmvhj\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:02:27.232424 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.232394 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-ggrlp"]
Apr 25 00:02:27.233928 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:02:27.233896 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fe2e2f_9a36_4a32_940e_d57e6113c6a6.slice/crio-2f3393f76176aac5b15d00b6e9d4fdcf89b15a99ffa64cab27155fcbf2de05b1 WatchSource:0}: Error finding container 2f3393f76176aac5b15d00b6e9d4fdcf89b15a99ffa64cab27155fcbf2de05b1: Status 404 returned error can't find the container with id 2f3393f76176aac5b15d00b6e9d4fdcf89b15a99ffa64cab27155fcbf2de05b1
Apr 25 00:02:27.411988 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.411898 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp" event={"ID":"25fe2e2f-9a36-4a32-940e-d57e6113c6a6","Type":"ContainerStarted","Data":"2f3393f76176aac5b15d00b6e9d4fdcf89b15a99ffa64cab27155fcbf2de05b1"}
Apr 25 00:02:27.412998 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.412964 2566 generic.go:358] "Generic (PLEG): container finished" podID="8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" containerID="1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace" exitCode=0
Apr 25 00:02:27.413129 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.413025 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc"
Apr 25 00:02:27.413129 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.413048 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" event={"ID":"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59","Type":"ContainerDied","Data":"1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace"}
Apr 25 00:02:27.413129 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.413083 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-4n8hc" event={"ID":"8c4c9b48-db0d-42ad-8a4a-25704aa5cc59","Type":"ContainerDied","Data":"7b833556f8da6463c3ed13a7137699830d717e6bd965d50166eecad556b474ea"}
Apr 25 00:02:27.413129 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.413106 2566 scope.go:117] "RemoveContainer" containerID="1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace"
Apr 25 00:02:27.421199 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.421181 2566 scope.go:117] "RemoveContainer" containerID="1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace"
Apr 25 00:02:27.421432 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:02:27.421414 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace\": container with ID starting with 1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace not found: ID does not exist" containerID="1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace"
Apr 25 00:02:27.421491 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.421438 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace"} err="failed to get container status \"1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace\": rpc error: code = NotFound desc = could not find container \"1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace\": container with ID starting with 1914166d8adcb72f0ef3317259141bc48ee042b7bd945a6dfc213b403badeace not found: ID does not exist"
Apr 25 00:02:27.433474 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.433450 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-4n8hc"]
Apr 25 00:02:27.439155 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:27.439135 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-4n8hc"]
Apr 25 00:02:28.417584 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:28.417541 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp" event={"ID":"25fe2e2f-9a36-4a32-940e-d57e6113c6a6","Type":"ContainerStarted","Data":"df9cac54c3126d61c66a7f50dc4248dd23887a5f41da99c4587080bf586b2317"}
Apr 25 00:02:28.418023 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:28.417650 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:02:28.435438 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:28.435387 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp" podStartSLOduration=2.076927011 podStartE2EDuration="2.435373204s" podCreationTimestamp="2026-04-25 00:02:26 +0000 UTC" firstStartedPulling="2026-04-25 00:02:27.235179971 +0000 UTC m=+533.099672240" lastFinishedPulling="2026-04-25 00:02:27.593626167 +0000 UTC m=+533.458118433" observedRunningTime="2026-04-25 00:02:28.433071276 +0000 UTC m=+534.297563564" watchObservedRunningTime="2026-04-25 00:02:28.435373204 +0000 UTC m=+534.299865491"
Apr 25 00:02:28.730653 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:28.730614 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" path="/var/lib/kubelet/pods/8c4c9b48-db0d-42ad-8a4a-25704aa5cc59/volumes"
Apr 25 00:02:59.426003 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:02:59.425974 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-ggrlp"
Apr 25 00:03:18.999709 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:18.999672 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77b66ccdd8-4cs5q"]
Apr 25 00:03:19.000121 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.000006 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" containerName="manager"
Apr 25 00:03:19.000121 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.000016 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" containerName="manager"
Apr 25 00:03:19.000121 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.000074 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c4c9b48-db0d-42ad-8a4a-25704aa5cc59" containerName="manager"
Apr 25 00:03:19.003130 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.003108 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.011761 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.011741 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b66ccdd8-4cs5q"]
Apr 25 00:03:19.088397 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088365 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-trusted-ca-bundle\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.088558 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088406 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c60b6ee6-928f-444d-8479-bb2124c901f2-console-oauth-config\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.088558 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088485 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-oauth-serving-cert\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.088712 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088590 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-service-ca\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.088712 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088621 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c60b6ee6-928f-444d-8479-bb2124c901f2-console-serving-cert\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.088712 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088682 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-console-config\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.088712 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.088699 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l8pd\" (UniqueName: \"kubernetes.io/projected/c60b6ee6-928f-444d-8479-bb2124c901f2-kube-api-access-7l8pd\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189479 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-service-ca\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189479 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189479 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c60b6ee6-928f-444d-8479-bb2124c901f2-console-serving-cert\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189740 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-console-config\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189740 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189522 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l8pd\" (UniqueName: \"kubernetes.io/projected/c60b6ee6-928f-444d-8479-bb2124c901f2-kube-api-access-7l8pd\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189740 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189550 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-trusted-ca-bundle\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189740 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c60b6ee6-928f-444d-8479-bb2124c901f2-console-oauth-config\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.189740 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.189651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-oauth-serving-cert\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.190304 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.190274 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-service-ca\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.190412 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.190328 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-console-config\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.190412 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.190370 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-oauth-serving-cert\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.190604 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.190560 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60b6ee6-928f-444d-8479-bb2124c901f2-trusted-ca-bundle\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.192089 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.192066 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c60b6ee6-928f-444d-8479-bb2124c901f2-console-serving-cert\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.192169 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.192104 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c60b6ee6-928f-444d-8479-bb2124c901f2-console-oauth-config\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.197036 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.197016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l8pd\" (UniqueName: \"kubernetes.io/projected/c60b6ee6-928f-444d-8479-bb2124c901f2-kube-api-access-7l8pd\") pod \"console-77b66ccdd8-4cs5q\" (UID: \"c60b6ee6-928f-444d-8479-bb2124c901f2\") " pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.312671 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.312609 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:19.433009 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.432974 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b66ccdd8-4cs5q"]
Apr 25 00:03:19.436454 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:03:19.436423 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc60b6ee6_928f_444d_8479_bb2124c901f2.slice/crio-01e268ac963c84bff2e706d58582e3325675a1e51b7a65741fef9016c8fc23f5 WatchSource:0}: Error finding container 01e268ac963c84bff2e706d58582e3325675a1e51b7a65741fef9016c8fc23f5: Status 404 returned error can't find the container with id 01e268ac963c84bff2e706d58582e3325675a1e51b7a65741fef9016c8fc23f5
Apr 25 00:03:19.599543 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.599456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b66ccdd8-4cs5q" event={"ID":"c60b6ee6-928f-444d-8479-bb2124c901f2","Type":"ContainerStarted","Data":"0b271efd7de1d74c5881aa1e395b574d7f276ce801d241432308cf07e65c7577"}
Apr 25 00:03:19.599543 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.599502 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b66ccdd8-4cs5q" event={"ID":"c60b6ee6-928f-444d-8479-bb2124c901f2","Type":"ContainerStarted","Data":"01e268ac963c84bff2e706d58582e3325675a1e51b7a65741fef9016c8fc23f5"}
Apr 25 00:03:19.620312 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:19.620260 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77b66ccdd8-4cs5q" podStartSLOduration=1.6202426810000001 podStartE2EDuration="1.620242681s" podCreationTimestamp="2026-04-25 00:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:03:19.619985297 +0000 UTC m=+585.484477585" watchObservedRunningTime="2026-04-25 00:03:19.620242681 +0000 UTC m=+585.484734971"
Apr 25 00:03:29.313234 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:29.313194 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:29.313653 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:29.313268 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:29.317991 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:29.317968 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:29.637603 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:29.637506 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77b66ccdd8-4cs5q"
Apr 25 00:03:29.684629 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:29.684593 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77bfd744f-6nzkj"]
Apr 25 00:03:34.644953 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:34.644923 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log"
Apr 25 00:03:34.645758 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:34.645735 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log"
Apr 25 00:03:54.708395 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:54.708315 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77bfd744f-6nzkj" podUID="3f2e2a06-e786-4f1b-9810-c246aa9459ff" containerName="console" containerID="cri-o://68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05" gracePeriod=15
Apr 25 00:03:54.948889 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:54.948863 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77bfd744f-6nzkj_3f2e2a06-e786-4f1b-9810-c246aa9459ff/console/0.log"
Apr 25 00:03:54.949055 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:54.948944 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77bfd744f-6nzkj"
Apr 25 00:03:55.003286 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003189 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-config\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003286 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003244 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-oauth-config\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003286 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003275 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-oauth-serving-cert\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003606 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003359 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-serving-cert\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003606 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003427 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-trusted-ca-bundle\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003606 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003488 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-service-ca\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003606 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003512 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68fgj\" (UniqueName: \"kubernetes.io/projected/3f2e2a06-e786-4f1b-9810-c246aa9459ff-kube-api-access-68fgj\") pod \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\" (UID: \"3f2e2a06-e786-4f1b-9810-c246aa9459ff\") "
Apr 25 00:03:55.003815 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003642 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-config" (OuterVolumeSpecName: "console-config") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:03:55.003815 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003795 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:03:55.003959 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003933 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:03:55.004009 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003953 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:03:55.004009 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.003976 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-oauth-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:03:55.004092 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.004004 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-service-ca" (OuterVolumeSpecName: "service-ca") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:03:55.005604 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.005561 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:03:55.005715 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.005628 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:03:55.005715 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.005643 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f2e2a06-e786-4f1b-9810-c246aa9459ff-kube-api-access-68fgj" (OuterVolumeSpecName: "kube-api-access-68fgj") pod "3f2e2a06-e786-4f1b-9810-c246aa9459ff" (UID: "3f2e2a06-e786-4f1b-9810-c246aa9459ff"). InnerVolumeSpecName "kube-api-access-68fgj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:03:55.104677 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.104648 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-trusted-ca-bundle\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:03:55.104677 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.104675 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f2e2a06-e786-4f1b-9810-c246aa9459ff-service-ca\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:03:55.104677 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.104686 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68fgj\" (UniqueName: \"kubernetes.io/projected/3f2e2a06-e786-4f1b-9810-c246aa9459ff-kube-api-access-68fgj\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\""
Apr 25 00:03:55.104925 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.104695 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-oauth-config\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 25 00:03:55.104925 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.104704 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f2e2a06-e786-4f1b-9810-c246aa9459ff-console-serving-cert\") on node \"ip-10-0-140-130.ec2.internal\" DevicePath \"\"" Apr 25 00:03:55.722976 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.722944 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77bfd744f-6nzkj_3f2e2a06-e786-4f1b-9810-c246aa9459ff/console/0.log" Apr 25 00:03:55.723393 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.722991 2566 generic.go:358] "Generic (PLEG): container finished" podID="3f2e2a06-e786-4f1b-9810-c246aa9459ff" containerID="68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05" exitCode=2 Apr 25 00:03:55.723393 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.723052 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77bfd744f-6nzkj" Apr 25 00:03:55.723497 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.723070 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bfd744f-6nzkj" event={"ID":"3f2e2a06-e786-4f1b-9810-c246aa9459ff","Type":"ContainerDied","Data":"68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05"} Apr 25 00:03:55.723548 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.723490 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bfd744f-6nzkj" event={"ID":"3f2e2a06-e786-4f1b-9810-c246aa9459ff","Type":"ContainerDied","Data":"29101c1b35dd9408b5c78157df9dce98ad200c20b5cecf246ed34be55c570aee"} Apr 25 00:03:55.723548 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.723525 2566 scope.go:117] "RemoveContainer" containerID="68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05" Apr 25 00:03:55.738581 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.738545 2566 scope.go:117] "RemoveContainer" containerID="68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05" Apr 25 00:03:55.738838 ip-10-0-140-130 kubenswrapper[2566]: E0425 00:03:55.738814 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05\": container with ID starting with 68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05 not found: ID does not exist" containerID="68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05" Apr 25 00:03:55.738910 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.738846 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05"} err="failed to get container status \"68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05\": rpc error: code = 
NotFound desc = could not find container \"68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05\": container with ID starting with 68273d5c4e060c8d1a498761bb982900eaf56e49c74b7c3b65381ccee04a7d05 not found: ID does not exist" Apr 25 00:03:55.748395 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.748373 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77bfd744f-6nzkj"] Apr 25 00:03:55.751903 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:55.751882 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77bfd744f-6nzkj"] Apr 25 00:03:56.730310 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:56.730272 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f2e2a06-e786-4f1b-9810-c246aa9459ff" path="/var/lib/kubelet/pods/3f2e2a06-e786-4f1b-9810-c246aa9459ff/volumes" Apr 25 00:03:59.320456 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.320420 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw"] Apr 25 00:03:59.320852 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.320811 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f2e2a06-e786-4f1b-9810-c246aa9459ff" containerName="console" Apr 25 00:03:59.320852 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.320823 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2e2a06-e786-4f1b-9810-c246aa9459ff" containerName="console" Apr 25 00:03:59.320935 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.320872 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f2e2a06-e786-4f1b-9810-c246aa9459ff" containerName="console" Apr 25 00:03:59.325126 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.325101 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.327829 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.327793 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 25 00:03:59.327960 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.327884 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 25 00:03:59.329261 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.329239 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dfwlg\"" Apr 25 00:03:59.331522 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.331499 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw"] Apr 25 00:03:59.444207 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.444163 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a2631a91-c9e8-4262-965c-a0ea52821875-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.444410 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.444269 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxx8v\" (UniqueName: \"kubernetes.io/projected/a2631a91-c9e8-4262-965c-a0ea52821875-kube-api-access-nxx8v\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.444410 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.444322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: 
\"kubernetes.io/projected/a2631a91-c9e8-4262-965c-a0ea52821875-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.545508 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.545471 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxx8v\" (UniqueName: \"kubernetes.io/projected/a2631a91-c9e8-4262-965c-a0ea52821875-kube-api-access-nxx8v\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.545686 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.545516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/a2631a91-c9e8-4262-965c-a0ea52821875-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.545686 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.545599 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a2631a91-c9e8-4262-965c-a0ea52821875-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.546014 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.545995 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a2631a91-c9e8-4262-965c-a0ea52821875-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.548138 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.548119 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/a2631a91-c9e8-4262-965c-a0ea52821875-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.555084 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.555063 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxx8v\" (UniqueName: \"kubernetes.io/projected/a2631a91-c9e8-4262-965c-a0ea52821875-kube-api-access-nxx8v\") pod \"seaweedfs-tls-custom-5c88b85bb7-h76zw\" (UID: \"a2631a91-c9e8-4262-965c-a0ea52821875\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.635661 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.635550 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" Apr 25 00:03:59.759422 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:03:59.759398 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw"] Apr 25 00:03:59.762143 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:03:59.762109 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2631a91_c9e8_4262_965c_a0ea52821875.slice/crio-5b27aa94b0a51f844dd06a76441c29760c706a0bc83b0e1b3bc0d2a5ef0ed15a WatchSource:0}: Error finding container 5b27aa94b0a51f844dd06a76441c29760c706a0bc83b0e1b3bc0d2a5ef0ed15a: Status 404 returned error can't find the container with id 5b27aa94b0a51f844dd06a76441c29760c706a0bc83b0e1b3bc0d2a5ef0ed15a Apr 25 00:04:00.743514 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:04:00.743477 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" 
event={"ID":"a2631a91-c9e8-4262-965c-a0ea52821875","Type":"ContainerStarted","Data":"5b27aa94b0a51f844dd06a76441c29760c706a0bc83b0e1b3bc0d2a5ef0ed15a"} Apr 25 00:04:02.754511 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:04:02.754472 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" event={"ID":"a2631a91-c9e8-4262-965c-a0ea52821875","Type":"ContainerStarted","Data":"172ac9b3bff1da13eac7cce9bf9de7dabe54b97604208b7e85cf40c44484c747"} Apr 25 00:04:02.770706 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:04:02.770646 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-h76zw" podStartSLOduration=1.318291211 podStartE2EDuration="3.770628958s" podCreationTimestamp="2026-04-25 00:03:59 +0000 UTC" firstStartedPulling="2026-04-25 00:03:59.763426787 +0000 UTC m=+625.627919052" lastFinishedPulling="2026-04-25 00:04:02.215764526 +0000 UTC m=+628.080256799" observedRunningTime="2026-04-25 00:04:02.769489949 +0000 UTC m=+628.633982238" watchObservedRunningTime="2026-04-25 00:04:02.770628958 +0000 UTC m=+628.635121247" Apr 25 00:08:34.667468 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:08:34.667436 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:08:34.670251 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:08:34.670232 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:13:34.691197 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:13:34.691165 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:13:34.694482 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:13:34.694457 2566 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:18:34.714860 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:18:34.714826 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:18:34.719272 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:18:34.719250 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:23:34.739289 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:23:34.739259 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:23:34.746767 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:23:34.746743 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:28:34.763950 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:28:34.763919 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:28:34.770999 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:28:34.770975 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:33:34.787270 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:33:34.787241 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:33:34.794769 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:33:34.794747 2566 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:38:34.814830 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:38:34.814739 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:38:34.823878 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:38:34.823857 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:43:34.837210 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:43:34.837178 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:43:34.847824 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:43:34.847802 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:48:34.860081 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:48:34.860048 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:48:34.872045 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:48:34.872019 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:53:34.883172 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:53:34.883147 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:53:34.896860 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:53:34.896831 2566 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:58:34.905389 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:58:34.905362 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:58:34.919042 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:58:34.919022 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log" Apr 25 00:59:33.790110 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:33.790030 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qr6ml_c5384178-0a8f-4c23-96ba-bcbe045f676c/global-pull-secret-syncer/0.log" Apr 25 00:59:33.887754 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:33.887717 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r4l6n_e4741ff1-3c93-48d0-8650-7f35e738b042/konnectivity-agent/0.log" Apr 25 00:59:33.995557 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:33.995527 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-130.ec2.internal_4f4adde147201157e73b34681d8e0de6/haproxy/0.log" Apr 25 00:59:37.782755 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:37.782724 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8569fb9b5c-8mzsg_7a35a06b-eb3e-4b02-86b4-6b9b66124779/metrics-server/0.log" Apr 25 00:59:38.026769 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.026742 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qxt66_2506cd54-279b-4478-bf09-69d1721b7bee/node-exporter/0.log" Apr 25 00:59:38.054359 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.054279 2566 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qxt66_2506cd54-279b-4478-bf09-69d1721b7bee/kube-rbac-proxy/0.log" Apr 25 00:59:38.078009 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.077987 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qxt66_2506cd54-279b-4478-bf09-69d1721b7bee/init-textfile/0.log" Apr 25 00:59:38.441046 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.441014 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-x6n29_59b7a531-2098-4fe7-a25f-ff5d0983ec0c/prometheus-operator/0.log" Apr 25 00:59:38.462305 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.462274 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-x6n29_59b7a531-2098-4fe7-a25f-ff5d0983ec0c/kube-rbac-proxy/0.log" Apr 25 00:59:38.497098 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.497067 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-gc5fc_03e56db5-a036-4978-be51-99e3748fbdc4/prometheus-operator-admission-webhook/0.log" Apr 25 00:59:38.543022 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.542994 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d46c47bd5-z7f8g_db915ee5-9581-4aea-8dd6-0db56a3017b1/telemeter-client/0.log" Apr 25 00:59:38.569228 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.569201 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d46c47bd5-z7f8g_db915ee5-9581-4aea-8dd6-0db56a3017b1/reload/0.log" Apr 25 00:59:38.592472 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.592449 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-5d46c47bd5-z7f8g_db915ee5-9581-4aea-8dd6-0db56a3017b1/kube-rbac-proxy/0.log" Apr 25 00:59:38.630135 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.630109 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69d9f4f478-2gp9t_26a6d56b-6407-48fb-bc9c-72e2a36ad99f/thanos-query/0.log" Apr 25 00:59:38.651528 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.651503 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69d9f4f478-2gp9t_26a6d56b-6407-48fb-bc9c-72e2a36ad99f/kube-rbac-proxy-web/0.log" Apr 25 00:59:38.673250 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.673226 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69d9f4f478-2gp9t_26a6d56b-6407-48fb-bc9c-72e2a36ad99f/kube-rbac-proxy/0.log" Apr 25 00:59:38.695130 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.695070 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69d9f4f478-2gp9t_26a6d56b-6407-48fb-bc9c-72e2a36ad99f/prom-label-proxy/0.log" Apr 25 00:59:38.718874 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.718853 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69d9f4f478-2gp9t_26a6d56b-6407-48fb-bc9c-72e2a36ad99f/kube-rbac-proxy-rules/0.log" Apr 25 00:59:38.740260 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:38.740233 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69d9f4f478-2gp9t_26a6d56b-6407-48fb-bc9c-72e2a36ad99f/kube-rbac-proxy-metrics/0.log" Apr 25 00:59:39.759258 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:39.759214 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-t7llw_1bb33cce-974b-42c1-aafe-f821da1a3f63/networking-console-plugin/0.log" Apr 25 
00:59:40.460312 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.460279 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77b66ccdd8-4cs5q_c60b6ee6-928f-444d-8479-bb2124c901f2/console/0.log" Apr 25 00:59:40.501558 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.501529 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rnv26_b365c4f6-54a2-4538-bc8c-68262709ee19/download-server/0.log" Apr 25 00:59:40.574897 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.574855 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"] Apr 25 00:59:40.578416 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.578391 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" Apr 25 00:59:40.580928 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.580906 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-htmzv\"/\"default-dockercfg-tmtwr\"" Apr 25 00:59:40.580928 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.580925 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-htmzv\"/\"openshift-service-ca.crt\"" Apr 25 00:59:40.581110 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.580953 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-htmzv\"/\"kube-root-ca.crt\"" Apr 25 00:59:40.586081 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.586055 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"] Apr 25 00:59:40.623729 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.623688 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-sys\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" Apr 25 00:59:40.623901 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.623758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-proc\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" Apr 25 00:59:40.623901 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.623808 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-lib-modules\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" Apr 25 00:59:40.623901 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.623831 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-podres\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" Apr 25 00:59:40.623901 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.623854 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2hxz\" (UniqueName: \"kubernetes.io/projected/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-kube-api-access-c2hxz\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " 
pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.724919 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.724818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-podres\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.724919 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.724856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2hxz\" (UniqueName: \"kubernetes.io/projected/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-kube-api-access-c2hxz\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.724919 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.724905 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-sys\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.725150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.724954 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-proc\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.725150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.724997 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-lib-modules\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.725150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.724995 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-podres\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.725150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.725021 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-sys\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.725150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.725073 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-proc\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.725150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.725101 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-lib-modules\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.733329 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.733305 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2hxz\" (UniqueName: \"kubernetes.io/projected/273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1-kube-api-access-c2hxz\") pod \"perf-node-gather-daemonset-w4mch\" (UID: \"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:40.884524 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.884493 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-l6dz5_0953ba64-8811-4f3a-a698-e921f97eab59/volume-data-source-validator/0.log"
Apr 25 00:59:40.889347 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:40.889327 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:41.008214 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:41.008179 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"]
Apr 25 00:59:41.011008 ip-10-0-140-130 kubenswrapper[2566]: W0425 00:59:41.010983 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod273fc007_6a4e_4acc_b8d4_fdd9e5fdeec1.slice/crio-88b50bc4791aecad483e1ddf7e0d2d84bc1c07be8480782a8a563fbe6d1d58ea WatchSource:0}: Error finding container 88b50bc4791aecad483e1ddf7e0d2d84bc1c07be8480782a8a563fbe6d1d58ea: Status 404 returned error can't find the container with id 88b50bc4791aecad483e1ddf7e0d2d84bc1c07be8480782a8a563fbe6d1d58ea
Apr 25 00:59:41.012510 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:41.012493 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:59:41.062625 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:41.062594 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" event={"ID":"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1","Type":"ContainerStarted","Data":"88b50bc4791aecad483e1ddf7e0d2d84bc1c07be8480782a8a563fbe6d1d58ea"}
Apr 25 00:59:41.573077 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:41.573038 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xcntc_6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f/dns/0.log"
Apr 25 00:59:41.591602 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:41.591576 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xcntc_6a5ab5f2-8430-48bf-b20f-c8e1fa32c31f/kube-rbac-proxy/0.log"
Apr 25 00:59:41.652726 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:41.652694 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tdkpx_07e0eebb-8365-490f-b2b2-16f26075fac7/dns-node-resolver/0.log"
Apr 25 00:59:42.066788 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:42.066752 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" event={"ID":"273fc007-6a4e-4acc-b8d4-fdd9e5fdeec1","Type":"ContainerStarted","Data":"87d492ff6fe992c6ed1683761217986152c761f35e1497e49e168d30530e42eb"}
Apr 25 00:59:42.067164 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:42.066898 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:42.082907 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:42.082858 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch" podStartSLOduration=2.082844748 podStartE2EDuration="2.082844748s" podCreationTimestamp="2026-04-25 00:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:59:42.081279547 +0000 UTC m=+3967.945771834" watchObservedRunningTime="2026-04-25 00:59:42.082844748 +0000 UTC m=+3967.947337036"
Apr 25 00:59:42.094963 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:42.094936 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29617920-629pq_90a6dc00-7da1-4523-ba9c-8d66c7704998/image-pruner/0.log"
Apr 25 00:59:42.196078 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:42.196031 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nqwjk_cdb63bec-b61b-4953-b2fc-7f06ee7063ac/node-ca/0.log"
Apr 25 00:59:42.848696 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:42.848665 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-c997485db-f7mxn_71a738cc-01a9-4ee7-ba02-0e53fcb15a2f/router/0.log"
Apr 25 00:59:43.195396 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:43.195360 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v7wqx_7fe18020-a109-4021-a4a7-567311f209f4/serve-healthcheck-canary/0.log"
Apr 25 00:59:43.518225 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:43.518137 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-s5tx5_1d003339-504a-4e95-aba4-a47bafe0f0d6/insights-operator/0.log"
Apr 25 00:59:43.520049 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:43.520027 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-s5tx5_1d003339-504a-4e95-aba4-a47bafe0f0d6/insights-operator/1.log"
Apr 25 00:59:43.672067 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:43.672037 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lwhhh_851c8e1e-3697-4187-858a-e65677890b54/kube-rbac-proxy/0.log"
Apr 25 00:59:43.690482 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:43.690455 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lwhhh_851c8e1e-3697-4187-858a-e65677890b54/exporter/0.log"
Apr 25 00:59:43.709967 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:43.709946 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lwhhh_851c8e1e-3697-4187-858a-e65677890b54/extractor/0.log"
Apr 25 00:59:45.705623 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:45.705583 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-64c4d9588d-ggrlp_25fe2e2f-9a36-4a32-940e-d57e6113c6a6/manager/0.log"
Apr 25 00:59:45.725099 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:45.725068 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-ngfw7_17989510-c022-447b-8cc2-94a511145bc5/manager/0.log"
Apr 25 00:59:46.107976 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:46.107895 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-h76zw_a2631a91-c9e8-4262-965c-a0ea52821875/seaweedfs-tls-custom/0.log"
Apr 25 00:59:48.080848 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:48.080819 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-w4mch"
Apr 25 00:59:51.067730 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.067699 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:59:51.087874 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.087851 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/egress-router-binary-copy/0.log"
Apr 25 00:59:51.107232 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.107208 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/cni-plugins/0.log"
Apr 25 00:59:51.126397 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.126377 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/bond-cni-plugin/0.log"
Apr 25 00:59:51.146519 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.146495 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/routeoverride-cni/0.log"
Apr 25 00:59:51.167889 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.167861 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/whereabouts-cni-bincopy/0.log"
Apr 25 00:59:51.193229 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.193206 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-44g68_93b15e14-7d7c-4b19-bb25-2e48ae26af80/whereabouts-cni/0.log"
Apr 25 00:59:51.529468 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.529434 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fs6wc_11e92e50-ca0e-4133-9e99-69897f47de51/kube-multus/0.log"
Apr 25 00:59:51.659126 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.659088 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wf82j_e101d25b-89b6-4522-8e39-35b94ce4d935/network-metrics-daemon/0.log"
Apr 25 00:59:51.675429 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:51.675400 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wf82j_e101d25b-89b6-4522-8e39-35b94ce4d935/kube-rbac-proxy/0.log"
Apr 25 00:59:52.964560 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:52.964528 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-controller/0.log"
Apr 25 00:59:52.980709 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:52.980678 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/0.log"
Apr 25 00:59:53.015435 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.015402 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovn-acl-logging/1.log"
Apr 25 00:59:53.037861 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.037826 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/kube-rbac-proxy-node/0.log"
Apr 25 00:59:53.059608 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.059551 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:59:53.080437 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.080416 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/northd/0.log"
Apr 25 00:59:53.099315 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.099288 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/nbdb/0.log"
Apr 25 00:59:53.117294 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.117262 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/sbdb/0.log"
Apr 25 00:59:53.316055 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:53.315973 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7lm4_b4228f03-25e8-4a96-b72d-5f9fa76ee207/ovnkube-controller/0.log"
Apr 25 00:59:54.294150 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:54.294111 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-v8dd2_58df42ab-cad3-4814-9298-b1098600ccdc/network-check-target-container/0.log"
Apr 25 00:59:55.139945 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:55.139916 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cgg9j_2adf0675-4eb9-4ad4-8d83-22c4f20fa0a0/iptables-alerter/0.log"
Apr 25 00:59:55.750128 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:55.750088 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-brxjn_349e5253-376d-444a-b099-86d3fb1b6b37/tuned/0.log"
Apr 25 00:59:57.339435 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:57.339401 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-wcrlg_44c53046-7de7-4ebe-b217-c83e9db484c4/cluster-samples-operator/0.log"
Apr 25 00:59:57.355531 ip-10-0-140-130 kubenswrapper[2566]: I0425 00:59:57.355509 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-wcrlg_44c53046-7de7-4ebe-b217-c83e9db484c4/cluster-samples-operator-watch/0.log"